| By Loc Le |
On September 27th, Meta Platforms Chief Executive Mark Zuckerberg spoke at the company’s Meta Connect conference to announce its latest technological innovations for consumers. Zuckerberg took the stage at Meta’s Silicon Valley campus to reveal the company’s new AI assistant and smart glasses, both of which aim to bridge the gap between the virtual and real worlds.
Building from Beta to Bing
Built on a custom version of the company’s Llama 2 large language model, the AI assistant, known as Meta AI, is a chatbot capable of generating photorealistic images and text responses based on users’ text prompts, as well as retrieving real-time information through the Bing search engine. However, Zuckerberg emphasized that Meta AI “isn’t just going to be about answering queries” and that the company’s focus is on entertainment and “helping you do things to connect with the people around you.” Although the assistant is still in beta, users can now request early access to the AI tool on WhatsApp, Messenger, and Instagram.
Privacy Concerns and Data Transparency
Given Meta’s controversial history with data and privacy issues, including the Cambridge Analytica scandal, Meta’s president of global affairs Nick Clegg sat for an interview with Reuters to address concerns regarding Meta AI. According to Clegg, the AI assistant has been trained in part on publicly available Facebook and Instagram posts, while private posts and chats have been excluded. Public datasets that have been filtered to remove private information have also been used to train the assistant. Citing LinkedIn as an example of a website excluded due to privacy concerns, Clegg stated that the company has “tried to exclude datasets that have a heavy preponderance of personal information.” As Meta continues developing Meta AI and additional AI applications, it is important for the company to keep up its efforts toward data transparency to ease the minds of its customers.
Integration with Smart Glasses
Meta AI will also be directly integrated into Meta’s new Ray-Ban smart glasses, which are slated to start shipping on Oct. 17th at a starting price of $299. In addition to the AI assistant, the smart glasses will let users broadcast livestreams of what they are seeing to Facebook and Instagram using a 12 MP camera and a five-microphone audio system. By next year, the company also plans to roll out a software update that will allow users to identify places and objects and translate languages with Meta AI.