Additional features such as voice messages, timers, and reminders are rolling out, but live translation isn't available yet.
Meta is rolling out some of its previously announced AI features for its Ray-Ban smart glasses to users in the US and Canada. In a post on Threads, Chief Technology Officer Andrew Bosworth said the glasses' latest update adds more natural language recognition, which means the stilted "Hey Meta, look and tell me" commands should no longer be necessary. Users will be able to engage the AI assistant without the "look and" part of the invocation.
Most of the other AI capabilities shown off at last month's Connect event are also arriving today, including voice messages, timers, and reminders. The glasses can also be used to ask Meta AI to call a phone number or scan a QR code. In a video posted to Instagram, CEO Mark Zuckerberg demonstrated the new reminders feature by using it to find his car in a parking garage. The most notable omission from this release is live translation, and Bosworth did not give a timeline for when it will be ready.
It's not the first time Meta's smart glasses have made news today: two Harvard students used them to effectively dox strangers by pairing facial recognition technology with a large language model, surfacing information such as addresses, phone numbers, family members' names, and partial Social Security numbers.