Meta Connect Keynote 2025
Meeting Summary
Discussions revolve around the future of immersive storytelling through 3D filmmaking and AI-integrated virtual reality. Highlights include advancements in 3D technology, democratization of content creation, and the potential for AI to transform interactive media, with emphasis on partnerships and upcoming projects.
Meeting Overview
AI glasses, offering personal superintelligence through realistic holograms, are poised to transform the metaverse. These devices, seamlessly integrating into daily life, enhance memory, senses, and communication. With partners like EssilorLuxottica, the technology is set to deliver real-time UIs directly into the wearer's field of view, marking a significant leap in wearable technology.
The next generation of Ray-Ban Meta glasses introduces double battery life, 3K video recording, and Conversation Focus, an AI feature that amplifies voices in noisy environments, enhancing user experience and accessibility across new and existing models.
Discusses strategies to appear more relaxed and natural when having one's photo taken, emphasizing casual actions like adjusting clothing to ease camera anxiety.
A live AI demonstration in a kitchen setting shows the AI assisting in creating a Korean-inspired steak sauce, highlighting the technology's potential for everyday use, despite current limitations on continuous operation.
The dialogue discusses the latest advancements in tech-infused eyewear, including new color options for Ray-Ban Meta, Oakley Meta Vanguard's enhanced performance features, and collaborations with Garmin and Strava for automatic video capture and stats overlay. It highlights the intersection of fashion and technology, emphasizing the glasses' style, functionality, and water resistance, while showcasing their integration into sports and adventure activities.
Inquires about the world record for the longest wakeboard rail slide, followed by a speed-limit check and a current-speed update, then a promotional announcement for the Oakley Meta Vanguard glasses, priced at $499, with a shipping date set for October 21.
Meta introduces its latest innovation, the Meta Ray-Ban Display glasses, featuring a high-resolution display and a groundbreaking neural interface. These glasses offer a unique way to interact silently through small muscle signals read at the wrist, enabling functions like messaging and video calls without voice, all while maintaining a classic style. The technology behind the display and the neural interface represents significant advancements in wearable computing.
Encountering video call difficulties, the speaker opts for alternative communication methods, decides to proceed with a demonstration, and creatively selects background music to maintain engagement.
A discussion on using Ray-Ban Meta glasses to enhance communication by adjusting focus and volume on friends' voices, and the potential of displaying subtitles in real-world interactions, amid technical hiccups and debugging efforts.
The dialogue highlights the benefits of using subtitles for better TV comprehension, the potential of real-time translation for multilingual communication, and the introduction of a viewfinder feature in Ray-Ban Metas for improved photo and video capture. It showcases the convenience of the glasses' compact design and charging mode, emphasizing their role in enhancing user experience in various scenarios.
The dialogue highlights the integration of the high-resolution Meta Ray-Ban Display and the Neural Band, showcasing their potential to revolutionize video chat and media consumption by offering a seamless and immersive experience.
AI glasses, featuring live AI sessions and high-resolution displays, promise to streamline daily conversations by assisting with follow-ups. Additionally, AI advancements are set to democratize immersive content creation, transforming virtual reality and social media experiences.
Meta Horizon Studio introduces AI tools for mesh, texture, and audio generation, paired with a new engine for faster, higher-quality virtual experiences. Users can scan rooms with Quest headsets to create immersive worlds, blending hyperscale spaces seamlessly.
Meta introduces Horizon Engine, enhancing virtual reality with customizable homes, quick world loading, increased concurrency, and interactive social experiences, setting the stage for future innovations in immersive technology.
The Quest content lineup highlights its virtual reality games, introducing Horizon TV, a new entertainment hub featuring Disney Plus, Hulu, ESPN, and Universal Studios' horror movies. With support for Dolby Atmos and Dolby Vision, Horizon TV offers immersive 3D experiences. The dialogue concludes with anticipation for immersive storytelling's role in virtual reality adoption.
The dialogue highlights a deep appreciation for 3D filmmaking and technological advancement, tracing the origins of this passion back two decades. It emphasizes the belief in 3D storytelling as a means to fulfill creative visions, showcasing a commitment to pushing boundaries in cinematic technology.
Discusses advancements in 3D filmmaking, emphasizing the potential of VR headsets for high-quality, theater-grade viewing experiences, surpassing traditional projection standards.
The dialogue highlights advancements in display technology, particularly the Quest series, which offer enhanced brightness, dynamic range, and color space, creating an immersive experience akin to a private movie theater. The discussion also touches on Horizon TV's potential, attributing its current success to the convergence of technological improvements, making it a formidable competitor to traditional TV and flat displays.
The dialogue highlights the potential of 3D technology to significantly boost emotional engagement and presence in content, envisioning a future where stereo ubiquity in entertainment, news, and live events is imminent. It emphasizes the role of partnerships in advancing tools and systems for filmmakers to integrate 3D into episodic and cinematic productions, focusing on the immediate application of this technology to meet growing demand.
Discusses the evolution of production technology, focusing on reducing costs, improving accessibility for creators, and enhancing viewer experiences through smarter hardware and software solutions. Highlights the shift from high-end productions to more conventional, user-friendly systems, aiming to democratize high-quality content creation.
Reflecting on three decades of digital innovation, the dialogue highlights the foresight of pioneers who envisioned a future dominated by digital media. From early 90s CG advancements to 3D movie theaters, the narrative underscores the relentless pursuit of progress, overcoming initial skepticism and resistance to new technologies. The journey from conceptual dreams to industry-wide adoption is likened to riding a wave of technological evolution, illustrating the transformative power of visionary thinking and perseverance.
An exciting 3D clip from Avatar: Fire and Ash was showcased at the Connect event, available for attendees to view on Meta Quest devices through Horizon TV for a limited time, emphasizing the anticipation for its December 19 theatrical release.
A live event concludes with anticipation for the Avatar: Fire and Ash release, new Meta Horizon Studio experiences, and an invitation to join a DJ's running club, emphasizing interactive and creative tech advancements.
Key Questions Answered
Q:Why are AI Glasses considered the ideal form factor for personal superintelligence?
A:AI Glasses are considered the ideal form factor for personal superintelligence because they allow users to remain present while accessing AI capabilities that enhance intelligence, communication, memory, and senses. They also enable an AI to see, hear, and talk to the user throughout the day, and to generate UI in real time.
Q:What are the key design values of the AI Glasses?
A:The key design values of the AI Glasses are: being great glasses first, with aesthetic design, comfort, and light weight; ensuring the technology does not overshadow the user's presence with others; and taking superintelligence seriously, designing the glasses to empower people with new capabilities and updating the software to enhance intelligence and personalization.
Q:What advancements have been made in terms of battery life and video recording in the new Ray-Ban Meta glasses?
A:The new Ray-Ban Meta glasses have double the battery life and can record 3K video, which is sharper, smoother, and more vivid than before.
Q:What is Conversation Focus and when will it be available?
A:Conversation Focus is a new feature that amplifies a friend's voice in a noisy environment, letting users turn up the volume on whoever they are speaking to. It will be available on the new generation of Ray-Ban Meta glasses and as a software update on all existing Ray-Ban Metas.
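Meta has not said how Conversation Focus works internally. A classical building block for boosting a voice from a known direction is delay-and-sum beamforming across multiple microphones, sketched below purely as an illustration; the function name, sample rate, and microphone spacing are all assumptions, not details from the keynote.

    // Minimal delay-and-sum beamformer (illustrative sketch, not Meta's code).
    // Delaying one channel so the target direction's wavefront lines up, then
    // averaging, reinforces that source and partially cancels off-axis noise.
    const SPEED_OF_SOUND = 343;  // m/s
    const SAMPLE_RATE = 16000;   // Hz, assumed
    const MIC_SPACING = 0.12;    // meters between temple mics, assumed

    function delayAndSum(left: Float32Array, right: Float32Array, angleRad: number): Float32Array {
      // Extra distance the wavefront travels to reach the far microphone.
      const delaySec = (MIC_SPACING * Math.sin(angleRad)) / SPEED_OF_SOUND;
      const delaySamples = Math.round(delaySec * SAMPLE_RATE);
      const out = new Float32Array(left.length);
      for (let i = 0; i < out.length; i++) {
        const j = i - delaySamples;
        const r = j >= 0 && j < right.length ? right[j] : 0;
        out[i] = 0.5 * (left[i] + r); // in-phase sum boosts the steered direction
      }
      return out;
    }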
Q:How will Meta AI evolve in terms of constant use and what challenges are still being worked on?
A:Meta AI is evolving from being invoked only when needed to becoming a service that runs all day and helps users throughout the day. The company is still working on the major technology challenge of implementing all-day live AI use, but currently, users can use live AI for about an hour or two straight.
Q:What fashion label recently debuted a look centered on Ray-Ban Meta at New York Fashion Week?
A:Fashion label LUAR, run by Raul Lopez, recently debuted a look centered on Ray-Ban Meta at New York Fashion Week.
Q:Which iconic brand has collaborated with Meta to launch AI glasses and what are their new features?
A:Oakley has collaborated with Meta to launch AI glasses, including the new Oakley Meta Vanguard, which has features like an iconic Oakley aesthetic designed for performance, a longer battery life, and advanced camera and audio capabilities.
Q:What unique capabilities do the new Oakley Meta Vanguard glasses have?
A:The new Oakley Meta Vanguard glasses offer a wider 122-degree field of view, 3K video capture, built-in stabilization, powerful open-ear speakers that are six decibels louder than the previous model, advanced wind-noise reduction, slow-motion and hyperlapse capture modes, auto-capture for athletes using Garmin devices, Strava integration, an LED visible in the wearer's peripheral vision, and water resistance with an IP67 rating.
Q:What partnerships have been announced for the new Oakley and Ray-Ban Meta glasses?
A:Partnerships announced for the new Oakley and Ray-Ban Meta glasses include auto-capture with Garmin devices and Strava integration, as well as customizable Oakley Prism shield lenses.
Q:What is the distinctive feature of the Meta Ray-Ban Display and how does it interact with the user?
A:The Meta Ray-Ban Display is distinctive for being the first pair of AI glasses with a high-resolution display and a new way to interact with the user through the Meta Neural Band. The display appears in one eye, sits off-center so it does not block the user's view, disappears when not in use, and offers resolution and brightness suitable for both indoor and outdoor use.
Q:What breakthrough technology is included in the Meta Ray-Ban Display and what are its capabilities?
A:The Meta Ray-Ban Display includes the Meta Neural Band, a breakthrough in neural interface technology that lets users silently control their glasses with minimal muscle movements. The Neural Band is integrated into a durable, lightweight, and comfortable wristband with 18 hours of battery life and water resistance.
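The keynote describes the Neural Band as reading small muscle signals at the wrist (surface EMG) but gives no implementation detail. A common first stage in any EMG pipeline is windowed root-mean-square energy with a threshold to detect gesture onsets; the sketch below shows only that idea, with assumed window length and threshold values.

    // Sliding-window RMS onset detector for one surface-EMG channel
    // (illustrative sketch; window length and threshold are assumptions).
    const WINDOW = 200;        // samples per analysis window, assumed
    const ON_THRESHOLD = 0.08; // normalized RMS level counted as a gesture, assumed

    function rms(samples: Float32Array, start: number, len: number): number {
      let sum = 0;
      for (let i = start; i < start + len; i++) sum += samples[i] * samples[i];
      return Math.sqrt(sum / len);
    }

    // Returns sample indices where windowed RMS first crosses the threshold,
    // i.e. candidate gesture onsets for a downstream classifier.
    function detectOnsets(emg: Float32Array): number[] {
      const onsets: number[] = [];
      let active = false;
      for (let start = 0; start + WINDOW <= emg.length; start += WINDOW) {
        const level = rms(emg, start, WINDOW);
        if (!active && level > ON_THRESHOLD) { onsets.push(start); active = true; }
        else if (active && level <= ON_THRESHOLD) active = false;
      }
      return onsets;
    }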
Q:What are the primary functions of the Meta Ray-Ban Display that enhance the user's messaging experience?
A:The primary functions that enhance the messaging experience include sending and receiving messages, dictating voice clips without being overheard, and typing responses covertly, which can be particularly useful in social situations.
Q:What challenges are associated with the current video call functionality?
A:The challenges associated with the current video call functionality include missed video calls and difficulties in connecting, as indicated by the speaker's initial frustration and the need for repeated attempts to establish a video call.
Q:What new capabilities are provided by the Meta AI integration?
A:The new capabilities provided by Meta AI integration include the ability to turn up the volume on a friend in a conversation, the display of subtitles for the hearing impaired, real-time translation, and the ability to see and manage photos and videos taken with the camera.
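The summary does not describe how the subtitle and translation features are chained together. A natural shape for such a feature is a streaming pipeline from audio chunks through speech recognition to translation, rendered as captions. In the sketch below, transcribeChunk and translateText are hypothetical stand-ins (stubbed here so the example runs), not a real Meta API.

    // Hypothetical streaming caption pipeline: audio -> transcript -> translated
    // subtitle. Both stages are stubs, not a real API.
    async function transcribeChunk(audio: Float32Array): Promise<string> {
      return audio.length > 0 ? "hola, ¿cómo estás?" : ""; // stubbed speech-to-text
    }

    async function translateText(text: string, targetLang: string): Promise<string> {
      return `[${targetLang}] ${text}`; // stubbed machine translation
    }

    async function* subtitles(
      audioChunks: AsyncIterable<Float32Array>,
      targetLang: string,
    ): AsyncGenerator<string> {
      for await (const chunk of audioChunks) {
        const transcript = await transcribeChunk(chunk); // recognize ~1s of audio
        if (transcript.trim().length === 0) continue;    // skip silence
        yield await translateText(transcript, targetLang); // caption for the display
      }
    }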
Q:What specific features related to the camera are highlighted?
A:The specific features highlighted related to the camera are the ability to see the picture before taking it, the ability to see the picture after taking it before sharing it, and the incorporation of a viewfinder.
Q:How does the speaker anticipate the AI integration will help with follow-up actions?
A:The speaker anticipates that with the AI integration, users will be able to start a live AI session where their glasses will be able to see and hear what they do, think about it, and then come back with assistance, thereby helping with follow-up actions while in the middle of a conversation.
Q:What real-life scenarios are used to demonstrate the utility of the AI integration?
A:The real-life scenarios used to demonstrate the utility of the AI integration include a conversation about building a surfboard, where the AI helps by researching and confirming design details, and a discussion about the feasibility of incorporating specific features into the surfboard design.
Q:What are the new features of Meta's AI glasses?
A:Meta's new AI glasses come with a high-resolution display, a neural interface, and transition lenses that darken into sunglasses outdoors. They are available in two colors, black and sand, and can be purchased for $799 in stores starting on September 30, where demos are also available.
Q:What is the role of AI in content creation according to the speech?
A:AI is anticipated to revolutionize content creation by enabling people to create entire new immersive and interactive experiences, such as worlds, games, characters, art, and holograms. Currently, creating such content is challenging and time-consuming, but AI is expected to simplify this process in the future, transforming virtual reality and other content platforms.
Q:What new capabilities does Meta Horizon Studio and Engine offer for creators?
A:Meta Horizon Studio and Engine provide creators with new AI tools to generate meshes, textures, TypeScript, audio, skyboxes, and more, significantly speeding up the creation process. The new Meta Horizon engine is optimized for the metaverse, offering faster performance, better graphics, and easier creation of infinite connected spaces with realistic physics and interaction.
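The actual Horizon scripting API is not shown in this summary, so the following is only an illustration of the kind of small interaction script a generation tool might emit. The World and Entity types and the onPlayerEnter event are invented for this sketch; they are not the real Meta Horizon API.

    // Hypothetical sketch of a generated interaction script. The World/Entity
    // types and onPlayerEnter event are invented for illustration; they are
    // not the actual Meta Horizon Worlds API.
    interface Entity {
      setGlow(on: boolean): void;
      playSound(clip: string): void;
    }

    interface World {
      find(name: string): Entity;
      onPlayerEnter(zone: string, handler: () => void): void;
    }

    export function setup(world: World): void {
      const door = world.find("AncientDoor");
      // When a player steps into the trigger zone, light the door and play audio.
      world.onPlayerEnter("DoorZone", () => {
        door.setGlow(true);
        door.playSound("door_rumble");
      });
    }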
Q:What are some of the new experiences being introduced with the new engine?
A:With the new Meta Horizon engine, early access to hyperscale capture allows users to scan a room and create an immersive, true-to-life world in minutes. The new engine also enables the creation of an immersive home and interconnected worlds with fast loading times, support for more people in the same world, and the ability to blend hyperscale worlds into Horizon.
Q:What entertainment content is being added to Horizon TV?
A:Horizon TV is getting a range of entertainment content including movies and TV shows, live sports, and music, with partnerships that include Disney Plus, Hulu, ESPN, and Universal Studios. It will support 3D special effects, Dolby Atmos, and soon, Dolby Vision for an immersive experience.
Q:How does the speaker view the future of storytelling in virtual reality?
A:The speaker believes that the shift towards more immersive storytelling in 3D is an exciting development that will drive a new wave of adoption for virtual reality and Augmented Reality (AR) devices, potentially making it a huge category for both virtual reality headsets and glasses.
Q:What is the speaker's background and passion in filmmaking?
A:The speaker has been passionate about filmmaking for over two decades, aiming to engage and draw people into the story and characters. Their interest in 3D filmmaking began in 1998 with the intention of creating better experiences than what was available at the time.
Q:When and how did the speaker first encounter 3D filmmaking?
A:The speaker was first exposed to 3D filmmaking in 1998 on a Universal ride film shot with massive film cameras. This experience led them to believe that 3D could be done better and, with the advent of digital cameras, they became a super early adopter, experimenting with the idea of using two cameras side by side to make 3D films.
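The side-by-side rig works because each camera sees a slightly shifted image, and that shift (disparity) encodes depth: under the standard pinhole stereo model, depth equals focal length times baseline divided by disparity. A one-function sketch with illustrative values:

    // Standard pinhole stereo relation: depth = (focalLength * baseline) / disparity.
    // A wider baseline or longer lens yields more disparity at the same depth.
    function depthFromDisparity(focalLengthPx: number, baselineM: number, disparityPx: number): number {
      if (disparityPx <= 0) return Infinity; // zero disparity: point at infinity
      return (focalLengthPx * baselineM) / disparityPx;
    }

    // Example: 1000 px focal length, 6.5 cm interaxial (about human eye spacing),
    // and 20 px of disparity put the point 3.25 m away.
    console.log(depthFromDisparity(1000, 0.065, 20)); // 3.25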
Q:What achievements has the speaker accomplished in the field of 3D filmmaking?
A:The speaker has built a great 3D team and has succeeded in creating 3D films. They have made the 3D cameras available to other filmmakers for concert films, sports, and big movies, including working with Ridley Scott.
Q:How does the speaker perceive the future of 3D content distribution?
A:The speaker envisions a future with a new distribution model where there can be theater-grade 3D content delivered directly to viewers' heads, thanks to advancements in technology like the Quest series.
Q:What was the speaker's reaction to viewing their content on the Quest 3?
A:The speaker was surprised and pleased with the viewing experience on the Quest 3, describing it as how they thought movies should be seen. The speaker appreciated the brightness and dynamic range, which they found more engaging than theater projection standards.
Q:What is the speaker's vision for the impact of 3D content on emotional engagement and presence?
A:The speaker believes that 3D content can lead to more emotional engagement and a stronger sense of presence. They estimate that 3D can increase engagement by about 20% and are excited about the potential for a 'stereo ubiquity' future where all content, including news and entertainment, is experienced in 3D.
Q:What is the speaker's role in promoting 3D content creation for film and television?
A:The speaker is working to partner with other filmmakers and showrunners to create a pipeline for 3D content. They are building cameras, systems, networking tools, and offering them at a small fee to other filmmakers and broadcasters to facilitate the creation of a large demand for 3D content.
Q:How is the speaker contributing to lowering the costs of producing 3D content?
A:The speaker is driving down the costs over time to build more conventional production methods for 3D content, making it more accessible for any kind of production to deliver a richer experience to their audience.
Q:What is the goal of the partnership mentioned in the speech?
A:The goal of the partnership is to create software and digital solutions that make the production of high-quality stereo content easier and more accessible for anyone to use, anywhere. The partnership aims to streamline the decision-making process around what makes good stereo and reduce eye strain, ultimately enabling the production of tens of thousands of hours of content yearly.
Q:When did the speaker first conceive the idea for the Avatar project?
A:The speaker first conceived the idea for the Avatar project in 1995, during a dream about a bioluminescent forest. The development of the project has been a continuous process for over 30 years.
Q:What does the speaker compare the current state of technology and content creation to?
A:The speaker compares the current state of technology and content creation to the early 90s and late 80s when Computer Graphics (CG) technology was first emerging. The speaker recalls skepticism about digital actors and the belief in the superiority of analog solutions, which led to the founding of the company Digital Domain. The speaker observes parallels between the then-revolutionary technology and the current advancements, suggesting a progression towards similar levels of adoption and ubiquity.
Q:What role did the speaker have in the early adoption of digital technology in movie theaters?
A:The speaker played a role in the early adoption of digital technology in movie theaters by working with the team at Texas Instruments to embed the ability to carry two image streams into their servers and electronics. This innovation allowed digital projection to be more widely adopted, leading to the eventual transition from analog to digital movie projection in almost all theaters, with the exception of some art houses.
Q:What is the significance of the upcoming film mentioned by the speaker?
A:The significance of the upcoming film mentioned by the speaker is that it represents a new era of Avatar content, with the film 'Avatar: Fire and Ash' set to hit theaters on December 19. The film will also feature exclusive 3D content available on Meta Quest devices through Horizon TV, signifying a milestone in the expansion of the Avatar franchise.
Q:What future developments in technology and content does the speaker anticipate?
A:The speaker anticipates future developments in technology and content to be similar to past revolutions, suggesting that the process of creating and adopting new technologies, such as 3D and digital movie projection, will continue. The speaker also hints at the excitement of upcoming projects like 'Avatar: Fire and Ash' and the potential for technology like Meta Horizon Studio and Engine, as well as the Fall 2025 line of glasses, to change how people experience content.
