
New Ray-Ban Stories Will Reportedly Support First Person Livestreaming

source link: https://www.uploadvr.com/ray-ban-stories-2-livestreaming/


The second generation Ray-Ban Stories will reportedly have better cameras and support first person livestreaming.

Meta and Ray-Ban owner Luxottica officially announced work on new smart glasses in October last year.

Journalist Janko Roettgers says he viewed internal Meta documents detailing the improvements and new features of the upcoming device.

The current Ray-Ban Stories are camera glasses for taking hands-free first person photos and videos. They also have speakers and a microphone for music and phone calls, but there is no display of any sort. Snapchat has been selling successive generations of a similar product, Spectacles, since 2016.

Roettgers reports the second generation Ray-Bans won't have a display either, but will have higher quality cameras, longer battery life, and an anti-tamper mechanism to disable capturing images or videos when the front LED is covered.

The glasses will also support livestreaming to Instagram and Facebook, with viewer comments read out by an assistant via the built-in speakers, according to the report.


In March The Verge reported that Meta’s VP of AR Alex Himel told staff the company planned to release third generation glasses in 2025 with a display and neural input wristband.

Called the “viewfinder”, this heads-up display will reportedly be used to show notifications, scan QR codes, and translate real-world text in real time. To be clear: this wouldn’t be true AR, it would be a small floating contextual display.

The neural wristband is based on tech from CTRL-Labs, a startup Facebook acquired in 2019, and Meta has openly discussed its development. It works by using EMG (electromyography) to read the neural signals passing through your arm from your brain to your fingers. Such a device could sense even incredibly subtle finger movements not clearly perceptible to people nearby. Himel reportedly said it will let the wearer “control the glasses through hand movements, such as swiping fingers on an imaginary D-pad”.


Earlier this month, however, The Wall Street Journal reported that less than 10% of first generation Ray-Ban Stories units are being "used actively".

Meta doesn't seem to be giving up, though. The company likely hopes the new features and improvements can draw in a wider audience and retain active users.
