By Emily Lechner

Apple and Google are teaming up to bring a host of new technologies to smartphones, including a camera sensor that can detect a person's eye color.
The two tech giants have created the EyeSense platform, which aims to bring advanced cameras to the masses, according to the companies.
The platform allows people to see in the dark without the need for special glasses, and Google says the technology could one day be used in a variety of devices.
It’s an ambitious goal, but Google hopes the platform can help bring more of the two companies’ products, including Google Glass, to the mass market.
That’s because, in some ways, Google is already using the technology to build a smartwatch.
Google has also announced plans for an eye-tracking device called the “EyeSense Wearable,” a head-worn product that would track where users are looking.
While the EyeSense technology might seem like a big deal mainly for people who already own Glass, Google’s move comes at a time when the company is also trying to sell more of its Android phones to businesses.
This month, the company launched its own line of Android phones.
While the two tech companies might seem to be competing to build more of their own products, Google says it will continue working with Apple to create more consumer-friendly products.
“We’re always interested in being a partner, but we’re not going to do it with a single partner,” Google’s Eric Schmidt said in an interview with The Verge.