The latest update also brings Amazon Music playback to the glasses.
Meta has updated the Ray-Ban Meta smart glasses with more hands-free capabilities. The first is the ability to share photos as Instagram Stories without pulling your phone out of your pocket. If you've already taken the picture you want to post, just say, "Hey Meta, share my last photo to Instagram." If you'd rather be spontaneous, you can instead say "Hey Meta, post a photo to Instagram" to capture a picture and share it as a Story on the spot, for those moments you don't mind showing your followers in real time, unedited.
You can now also get your glasses to play music from Amazon Music. Just say "Hey Meta, play Amazon Music" to start listening through the smart glasses' open-ear audio, and you can control playback with either the device's touch controls or your voice. If you have a Calm account and need some much-needed relaxation, you can use the glasses to listen to guided meditations or mindfulness exercises instead by saying, "Hey Meta, play the Daily Calm." If you don't have a Calm account, you can get a free three-month subscription by following the prompts in the Meta View app. Meta says all of these features are "rolling out gradually," so if you don't have access to them yet, you should soon.
Last month, after months of testing, Meta also released multimodal AI for the Ray-Ban smart glasses. Like the Rabbit R1 and the Humane AI Pin, it lets the glasses act as a personal AI device independent of your smartphone. You can now ask the smart glasses to identify landmarks, describe objects around you, and read signs in different languages, which is especially handy for frequent travelers. Meta has also enabled hands-free video calling on the glasses through WhatsApp and Messenger.