A fanciful mockup of digital glasses via TechAcute
According to Kuo, Apple's AR glasses will be marketed as an iPhone accessory and primarily take a display role while wirelessly offloading computing, networking, and positioning to the iPhone.
Designing the AR glasses to work as an iPhone accessory is also expected to allow Apple to keep the glasses slim and lightweight, rather than trying to pack all of the processing hardware into a single device.
Kuo believes Apple is aiming to begin mass-producing the glasses as early as the fourth quarter of this year, although he admits the timeframe could be pushed back to the second quarter of 2020.
Back in November 2017, Bloomberg reported that Apple was developing an AR headset and aimed to have it ready by 2019, although the company could ship a product in 2020. The report said the headset would run on a new custom operating system, based on iOS, and dubbed "rOS" for "reality operating system."
Apple has been exploring virtual reality and augmented reality technologies for more than 10 years based on patent filings. The company is also rumored to have a secret research unit comprising hundreds of employees working on AR and VR, exploring ways the emerging technologies could be used in future Apple products.
Apple CEO Tim Cook has talked up the prospect of augmented reality several times, saying he views AR as "profound" because the technology "amplifies human performance instead of isolating humans."
The U.S. Patent and Trademark Office on Tuesday granted Apple a patent describing a "method for representing points of interest in a view of a real environment on a mobile device," and while there is no specific mention of so-called Apple Glasses, the patent describes a "head-mounted display."
As with many other augmented reality devices, the head-mounted display would be able to overlay computer-generated virtual information onto a view of the real environment. More specifically, the headset would have a camera that is able to identify and annotate points of interest and other objects.
One illustration in the patent shows a head-mounted display showing buildings, each identified with an overlaying label. On a paired iPhone, a user would be able to tap on the point of interest to view additional information.
While the head-mounted display looks like a pair of snowboarding goggles, patent illustrations are merely examples.
Apple files numerous patent applications every week, of course, and many of the inventions never see the light of day. Patents are also very detailed, encompassing many possible ideas, even ones that Apple might not have any plans to advance. So the exact implementation, if any, remains to be seen.
At this point, it's not entirely clear whether Apple is working on Google Glass-style smart glasses or a HoloLens-like headset. Apple CEO Tim Cook has expressed more of an interest in augmented reality than virtual reality, however, and the patent does suggest that Apple is focused on augmenting the real world.
Apple has named longtime Apple employee and iPhone executive Frank Casanova as its first head of marketing for augmented reality, reports Bloomberg.
Casanova, who has been at Apple since 1988, is responsible for all aspects of product marketing for Apple's "augmented reality initiative," according to his LinkedIn profile.
Prior to being named head of Apple's AR marketing effort, Casanova worked as Apple's senior director of iPhone partner marketing. He started at Apple as a product manager in May 1988, spent a short stint at another company for a year in 1997, and has been working at Apple since then. He was around for the launch of the iPhone as well as many other pivotal products.
As Bloomberg points out, Apple's decision to name a head of product marketing for augmented reality indicates the importance of the feature for the future of the company. Apple debuted ARKit, its augmented reality platform, in iOS 11, and made significant improvements to it in iOS 12.
ARKit turned Apple's iPhones and iPads into the largest augmented reality platform available, with many apps now taking advantage of augmented reality capabilities.
Avi Bar-Zeev, who was the co-creator of Microsoft's HoloLens, has left his position at Apple, reports Variety.
Bar-Zeev was reportedly working on Apple's augmented reality headset, which rumors have suggested could launch as early as 2020. Bar-Zeev left his position at Apple in January and provided the following statement to Variety:
"I left my full-time position at Apple in January. I had the best exit one can imagine. I have only nice things to say about Apple and won't comment on any specific product plans."
Prior to joining Apple, Bar-Zeev worked at Microsoft, where he helped found and invent the HoloLens, Microsoft's mixed reality headset. Before that, he worked at Disney helping develop VR experiences, and at Keyhole, a company that was purchased by Google and became the foundation of Google Earth.
Bar-Zeev had been at Apple since 2016, presumably on the AR/VR team. His LinkedIn profile said that he led the "experience prototyping" team "for a new effort."
"Developed key prototypes to rapidly prove concepts, explore, educate and build support. Developed user stories and technical requirements for the long-term roadmap, while working across design and engineering to ensure success," reads Bar-Zeev's profile.
Since an intrepid reporter just asked me to comment, I'll share this with y'all too...
I left my full-time position at Apple in January. I had the best exit one can imagine. I have only nice things to say about Apple and won't comment on any specific product plans. 1/2
For the record, I helped invent Hololens and built the first prototypes to sell the idea. If I hadn't been there in January 2010, it might have started later without my help. Let's never forget the thousands of smart people who made it real.
More than half a dozen incognito Apple representatives, including employees of known subsidiaries, visited AR waveguide suppliers like DigiLens, Lumus, Vuzix and WaveOptics at CES 2019, according to a person with knowledge of the meetings.
Apple has been known to be interested in AR/VR technology for years, with reports of hundreds of employees working on the technology. Tim Cook notably said in 2017 that the technology to do AR glasses in a "quality way" didn't yet exist, indicating that Apple would wait until it could deliver the best experience.
Rumors of an Apple AR Headset reignited when a report in April of 2018 suggested that a headset was actively being developed with a launch target of 2020. Since that rumor, it was revealed that Apple had also purchased Akonia Holographics, a startup that makes lenses for augmented reality glasses.
Kim's work history includes user interface design at Microsoft, contributions to HoloLens and Xbox One S, and designs for Tesla's Model 3, S, X, and Y. His LinkedIn profile confirms his move to Apple this month, but doesn't specify what he'll be doing at the company outside of being a "Designer." Due to his history, it could be work on Apple's rumored AR glasses, Project Titan, or something else entirely.
Project Titan is Apple's long-rumored vehicle project, which is believed to have originated in 2014 and could see a consumer Apple Car available between 2023 and 2025, according to Ming-Chi Kuo. This estimate came from a report in August, but previous rumors suggested that focus on Project Titan has shifted to autonomous driving software for cars instead of a vehicle specifically designed by Apple.
Kuo's report and news about Apple's hiring of Tesla employees -- now including Kim -- have reignited speculation that Apple could again be planning to build its own vehicle. If accurate, the future Apple Car would pair Apple's autonomous driving software with a car designed by the Cupertino company.
Looking ahead to next year, Ming-Chi Kuo believes that contrary to some analysts' expectations, Apple is unlikely to integrate a time-of-flight (ToF) depth sensing system in the rear camera of its 2019 iPhones.
Reports of Apple including rear-facing 3D sensing capabilities in its 2019 iPhone lineup first began appearing last year. At the time, Apple was said to be evaluating a ToF approach, which is different from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X.
TrueDepth relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user's face and measures the distortion to generate an accurate 3D image for authentication. By contrast, ToF calculates the time it takes for a laser to bounce off surrounding objects to create a 3D image of the environment.
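The time-of-flight principle described above reduces to a simple formula: since the emitted light travels to the object and back, distance is half the round-trip time multiplied by the speed of light. A minimal illustrative sketch (not Apple's implementation, just the underlying arithmetic):

```python
# Illustrative only: a time-of-flight sensor infers distance from the
# round-trip time of a pulse of emitted light.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to an object given the time for light to travel
    out and back: d = (c * t) / 2."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A surface roughly 1.5 m away reflects light back in about 10 nanoseconds:
print(round(tof_distance(10e-9), 3))  # ~1.499 m
```

The nanosecond timescales involved are why ToF sensors need very fast, precise timing hardware, and why (per Kuo) the depth data they produce can still be too coarse for the AR experiences Apple is targeting.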
However, in a note to investors this morning, obtained by MacRumors, Kuo said that the company would not use a rear-side application of ToF technology in next year's smartphone lineup for two reasons.
First, the distance and depth information provided by rear-side ToF is currently too immature for creating the "revolutionary AR experience" that Apple ultimately wants to develop. Kuo believes that in addition to camera distance/depth data, the complete AR ecosystem Apple is aiming for requires 5G connectivity, augmented-reality glasses, and a "more powerful Apple Map database".
Second, the addition of rear-side ToF would do little to improve the iPhone's photo-taking capabilities, because the current dual-camera setup is already capable of capturing or simulating enough distance/depth information to make photo-features like Portrait Mode a reality.
"We believe that iPhone's dual-camera can simulate and offer enough distance/depth information necessary for photo-taking; it is therefore unnecessary for the 2H19 new iPhone models to be equipped with a rear-side ToF," Kuo concludes.
Last October, Kuo similarly pushed back against reports that Apple would expand its front-facing TrueDepth system to the rear camera on this year's iPhone lineup, which is due to be announced later today. Given Kuo's latest remarks on Apple's smartphone roadmap, rumors of rear-camera 3D sensing capabilities for next year's lineup seem just as unlikely.
Apple has purchased a startup that makes lenses for augmented reality glasses, reports Reuters. The acquisition lends further evidence to rumors suggesting Apple is developing an augmented reality headset.
Akonia Holographics, the startup that Apple bought, advertises the "world's first commercially available volume holographic reflective and waveguide optics for transparent display elements in smart glasses."
The displays that it makes are said to use the company's HoloMirror technology for "ultra-clear, full-color performance" to enable the "thinnest, lightest head worn displays in the world."
Apple confirmed the purchase to Reuters with the boilerplate statement it issues for acquisitions: "Apple buys smaller companies from time to time, and we generally don't discuss our purpose or plans."
Akonia Holographics was founded in 2012 by holography scientists who focused on holographic data storage before moving on to develop displays for augmented reality glasses, according to the company's website.
It's not clear when exactly Apple purchased Akonia Holographics, but sources that spoke to Reuters suggested the company had become "very quiet" over the last six months, indicating the acquisition may have been made in the first half of 2018.
Multiple rumors have suggested Apple has a research unit of hundreds of employees working on AR and VR and exploring the ways the emerging technologies could be used in future Apple products.
Apple is said to be exploring several prototypes, including a powerful AR/VR headset with an 8K display for each eye and a set of augmented reality smart glasses with a dedicated display, a built-in processor, and an "rOS" or reality operating system. Rumors indicate that Apple's first AR or VR product could come out in 2019 or 2020.
Apple's purchase of Akonia Holographics is its second recent AR/VR related acquisition. In November 2017, Apple purchased Vrvana, a company that developed a mixed reality headset called Totem.
Prior to developing his Cyber Paint app, Crispin served as a lead UX designer at DAQRI, where he worked on software for augmented reality and head-mounted displays. Before that, he was a freelance VR developer, so he has experience with both AR and VR.
Apple has made several AR/VR-related hires and acquisitions in recent years, all of which are outlined in our dedicated AR/VR roundup. Acquisitions include Vrvana, a company that developed a mixed reality headset called Totem.
It's not entirely clear when we might see an AR or VR headset from Apple, but multiple rumors have suggested Apple is aiming for a 2019 to 2020 launch date for such a device.
Engineer and popular YouTuber Mark Rober, who is known for his science-related videos that can rack up millions of views, works at Apple as an engineer in the special projects group, reports Variety.
The site says that Rober has been working on Apple's virtual reality projects, including "using VR as on-board entertainment for self-driving cars." On Rober's LinkedIn page, it says he works as a product designer at an unspecified company he first joined in 2015, suggesting he's been with Apple for some time.
To explain the kinds of things Rober may be working on, Variety points to a pair of Apple patent applications that cover an "Immersive Visual Display" and an "Augmented Virtual Display," which were filed in 2016 and describe virtual reality systems that could be used by passengers in self-driving cars. The patents list Mark Rober as a primary inventor.
Both of the patents describe a VR headset that could help alleviate in-car motion sickness in autonomous vehicles, with one suggesting replacing the view of the real world with virtual environments that include visual cues to match physical motions the passenger is experiencing and the other describing virtual content that appears as a fixed object in the external environment.
One of the patents suggests that a virtual reality system for cutting down on motion sickness could aid in productivity, because it would allow passengers (in an autonomous vehicle, everyone aboard is a passenger, as no driver is required) to work while the vehicle is in motion without experiencing motion sickness. It also suggests VR could provide "enhanced virtual experiences" to passengers.
Many passengers in vehicles may experience motion sickness. Typically, this is not the case for the driver. However, with the arrival of autonomous vehicles, the driver becomes a passenger, and thus may want to occupy themselves while, for example, riding to work. Passengers in conventional or autonomous vehicles may, for example, want to read a book, or work on their notebook computer.
Apple is currently working on autonomous driving software that is being tested in Lexus SUVs that are out on the road near its Cupertino headquarters, and the technology is reportedly being implemented into employee shuttles.
Apple has inked a deal with Volkswagen to use Volkswagen T6 Transporter vans as self-driving shuttles to transport employees around its various campuses and office buildings in the San Francisco Bay Area. It's not clear if and when Apple plans to implement the VR technology Rober is working on into the shuttles or other future autonomous car projects, but there are many concepts that Apple patents that never make it into finished products.
Prior to joining Apple, Rober spent eight years as a mechanical engineer at NASA's Jet Propulsion Laboratory, and he also served as a product designer at Morph Costumes.
He also maintains a popular YouTube channel with 3.4 million subscribers, sharing science-related videos like "Lemon Powered Supercar," "How to Survive a Grenade Blast," "How Much Pee is in Your Pool," and "iPhone ATM PIN Code Hack - How to Prevent."