
    Enhancing HDR on Instagram for iOS With Dolby Vision

By Samuel Alejandro · January 6, 2026 · 9 Mins Read

    • Instagram for iOS now supports Dolby Vision and ambient viewing environment (amve) to enhance the video viewing experience.
    • HDR videos created on iPhones contain unique Dolby Vision and amve metadata, which must be preserved end-to-end.
    • Instagram for iOS is the first Meta app to support Dolby Vision video, with further support planned for other Meta apps.

    Every iPhone-produced HDR video encoding includes two additional pieces of metadata that help ensure picture consistency across different displays and viewing conditions:

    • Ambient viewing environment (amve) provides characteristics of the nominal ambient viewing environment for displaying video content. This information allows the playback device to adjust video rendering if actual viewing conditions differ from the encoding environment (the payload layout is sketched after this list).
    • Dolby Vision enhances color, brightness, and contrast to better match the video to the display’s capabilities.
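
    For concreteness, the amve payload itself is tiny. Below is a minimal Swift sketch of its layout per ITU-T H.274: a 32-bit illuminance value followed by a CIE 1931 (x, y) chromaticity pair. The field widths follow the spec; the unit scaling reflects one reading of it and should be treated as an assumption.

```swift
import Foundation

// A minimal sketch of the ambient viewing environment (amve) payload as
// laid out in ITU-T H.274. The unit scaling (0.0001 lux, 0.00002
// chromaticity increments) is an assumption based on the spec.
struct AmbientViewingEnvironment {
    let illuminanceLux: Double   // from ambient_illuminance, u(32)
    let chromaticityX: Double    // from ambient_light_x, u(16)
    let chromaticityY: Double    // from ambient_light_y, u(16)

    init?(payload: Data) {
        guard payload.count >= 8 else { return nil }
        let bytes = [UInt8](payload)
        // Big-endian fields, as in the bitstream.
        let lum = bytes[0..<4].reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
        let x = (UInt16(bytes[4]) << 8) | UInt16(bytes[5])
        let y = (UInt16(bytes[6]) << 8) | UInt16(bytes[7])
        illuminanceLux = Double(lum) * 0.0001
        chromaticityX = Double(x) * 0.00002
        chromaticityY = Double(y) * 0.00002
    }
}
```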

While the Instagram and Facebook iOS apps have supported high dynamic range (HDR) video since 2022, the initial rollout did not include Dolby Vision or amve delivery and playback. Derived encodings were created with FFmpeg, which historically lacked support for Dolby Vision and amve, so the tooling discarded this metadata. The resulting pictures did not fully deliver their intended viewing experience, which was especially noticeable at low screen brightness levels.

    Following user feedback from iOS apps, efforts were made with partners to preserve iOS-produced amve and Dolby Vision metadata end-to-end, significantly enhancing the HDR viewing experience on iOS devices.

    How Meta Processes Video

To see where the metadata was being lost, it helps to understand the lifecycle of a video at Meta.

    Most videos uploaded through the apps undergo three main stages:

    1. Client Processing

    During client processing, the creator’s device flattens their composition into a single video file suitable for upload. For HDR videos from iOS devices, this involves encoding with HEVC using the Main 10 profile. At this stage, amve and Dolby Vision metadata are produced, added to the encoded bitstream, and uploaded to Meta’s servers.
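
    As a rough illustration of that encoding step, here is a hedged Swift sketch of AVAssetWriter output settings for HEVC Main 10 with the HLG transfer function, the base layer that Dolby Vision profile 8.4 builds on. The amve and Dolby Vision metadata themselves are attached by Apple's HDR capture pipeline rather than by these settings, and the dimensions below are illustrative.

```swift
import AVFoundation
import VideoToolbox

// A sketch of output settings for 10-bit HEVC with BT.2020 primaries and
// the HLG transfer function. Apple's capture pipeline, not these settings,
// injects the amve and Dolby Vision metadata when recording HDR.
let hdrVideoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.hevc,
    AVVideoWidthKey: 1920,   // illustrative dimensions
    AVVideoHeightKey: 1080,
    AVVideoProfileLevelKey: kVTProfileLevel_HEVC_Main10_AutoLevel as String,
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_ITU_R_2100_HLG,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020,
    ],
]

let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: hdrVideoSettings)
```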

    2. Server Processing

    In the server processing stage, the transcoding system generates different video versions for various consumers. Since playback occurs across diverse devices with varying capabilities, the video must be produced in an optimal format for each. For HDR uploads, this involves creating an SDR version for non-HDR devices, a VP9 version for most players, and (for popular videos) an AV1 version offering the highest quality at the lowest file size.

    Each version is produced at several bitrates (and therefore file sizes) to ensure smooth playback for consumers with varying network conditions, preventing long download waits, though lower bitrates mean lower quality. All derived encodings are created with FFmpeg, which historically lacked amve and Dolby Vision support. This was the stage where the metadata was previously dropped.
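
    To make the rendition idea concrete, here is a hypothetical selection sketch in Swift. The ladder values are invented for illustration; they are not Meta's actual encoding ladder.

```swift
// A hypothetical encoding ladder; real ladders are tuned per product.
struct Rendition {
    let codec: String   // "av1" or "vp9" in this sketch
    let kbps: Int
}

let ladder = [
    Rendition(codec: "av1", kbps: 4_500),
    Rendition(codec: "av1", kbps: 2_300),
    Rendition(codec: "vp9", kbps: 2_000),
    Rendition(codec: "vp9", kbps: 900),
]

/// Picks the highest-bitrate rendition that the device can decode and the
/// measured bandwidth can sustain.
func selectRendition(bandwidthKbps: Int, supportsAV1: Bool) -> Rendition? {
    ladder
        .filter { supportsAV1 || $0.codec != "av1" }
        .filter { $0.kbps <= bandwidthKbps }
        .max { $0.kbps < $1.kbps }
}
```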

    3. Consumption

    During the consumption stage, the viewer’s device selects the version that will play smoothly (without stalls), decodes it frame by frame, and displays each frame on the screen. On iOS, all HDR playback utilizes Apple’s AVSampleBufferDisplayLayer (AVSBDL), which consumes amve and Dolby Vision metadata alongside each decoded frame.
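
    A minimal sketch of that handoff, assuming frames have already been decoded into CMSampleBuffers carrying the HDR metadata as attachments:

```swift
import AVFoundation

// Decoded frames are wrapped in CMSampleBuffers and enqueued on
// AVSampleBufferDisplayLayer, which applies the HDR metadata at display time.
let displayLayer = AVSampleBufferDisplayLayer()
displayLayer.videoGravity = .resizeAspect

func display(_ sampleBuffer: CMSampleBuffer) {
    // Backpressure: only enqueue while the layer can accept more frames.
    if displayLayer.isReadyForMoreMediaData {
        displayLayer.enqueue(sampleBuffer)
    }
}
```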

    How amve Support Was Added

    When amve support was first pursued in 2022, an interesting observation was made. Because the player operates on a decoupled architecture of lower-level components, rather than a typical high-level AVPlayer setup, it was possible to inspect intact video encodings and examine amve metadata between the decoder and AVSBDL. Every frame of every video appeared to carry identical metadata, which enabled a quick fix: hardcoding these values directly into the player pipeline.

    This situation was not ideal. Despite the values appearing static, there was no enforcement; a new iPhone or iOS version might produce different values, which would then be used incorrectly. Furthermore, amve is not a concept on Android, so the hardcoded values would also be applied when viewing an Android-produced HDR encoding on an iPhone, resulting in a technically inaccurate image.

    In 2024, collaboration with the community led to amve support in FFmpeg. Logging was also implemented, confirming the two-year-old assertion that the values remained unchanged. However, if they do change, the system is now properly equipped to handle it.

    Enabling Dolby Vision

    Dolby Vision was not as straightforward as amve to adopt.

    Challenge #1: The existing specification covered metadata carriage within an HEVC bitstream, but HEVC is not among the formats Meta delivers.

    iPhone-produced HDR utilizes Dolby Vision profile 8.4, where ‘8’ signifies a profile using HEVC (the video codec), and ‘.4’ denotes cross-compatibility with HLG (the standard for HDR video that players without Dolby Vision support would adhere to).
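
    As a toy illustration of this numbering, the sketch below splits a profile string into its two components; the type names are hypothetical and simply mirror the semantics just described.

```swift
// Toy model of Dolby Vision profile strings such as "8.4" or "10.4".
struct DolbyVisionProfile {
    let profile: Int              // 8 = HEVC base layer, 10 = AV1 carriage
    let crossCompatibilityID: Int // 4 = HLG cross-compatible fallback

    init?(_ string: String) {
        let parts = string.split(separator: ".")
        guard parts.count == 2,
              let profile = Int(parts[0]),
              let crossCompat = Int(parts[1]) else { return nil }
        self.profile = profile
        self.crossCompatibilityID = crossCompat
    }
}

// DolbyVisionProfile("8.4")  -> HEVC carriage, HLG-compatible
// DolbyVisionProfile("10.4") -> AV1 carriage, HLG-compatible
```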

    To deliver Dolby Vision metadata, it needed to be carried within a codec that is delivered. Fortunately, Dolby developed Profile 10 for Dolby Vision carriage within AV1. Since VP9 does not provide a facility for additional metadata, there is currently no Dolby Vision support for it, but exploring alternate delivery mechanisms is of interest.

    However, Dolby Vision Profiles 8 and 10 were not adequately supported by existing video processing tools, including FFmpeg and Shaka Packager. Based on Dolby's specifications, collaboration with FFmpeg developers led to full support for Dolby Vision Profile 8 and Profile 10. Specifically, FFmpeg can now transcode HEVC with Profile 8.4 into AV1 with Profile 10.4 using both the libaom and libsvtav1 encoders. Fixes were also made to other parts of the stack, including the dav1d decoder and Shaka Packager, to properly support Dolby Vision metadata.

    Challenge #2: Getting Dolby Vision into AVSampleBufferDisplayLayer

    When AVSBDL receives an encoded bitstream in a supported format, such as HEVC from an iPhone camera, Dolby Vision functions automatically. However, Meta's player feeds AVSBDL buffers that it has decoded itself, because it must handle formats Apple does not decode natively (e.g., AV1 on devices predating the iPhone 15 Pro). Given this setup, the Dolby Vision metadata also had to be extracted independently.

    Following Dolby's newly established specification for Profile 10 carriage within an AV1 bitstream, the Dolby Vision metadata was extracted manually and packaged into the format AVSBDL expects, and the system became operational.
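
    The extraction step can be sketched as follows. In AV1, such metadata travels in a metadata OBU carrying an ITU-T T.35 message; the constants below (metadata type 4 for T.35, country code 0xB5) follow one reading of the AV1 specification and should be treated as assumptions, and real Dolby Vision parsing involves additional provider codes and RPU handling that this sketch omits.

```swift
// Assumed constants: METADATA_TYPE_ITUT_T35 and the T.35 country code used
// by Dolby's registration. Verify against the AV1 and Dolby specs.
let metadataTypeITUT35: UInt64 = 4
let t35CountryCode: UInt8 = 0xB5

/// Reads an unsigned LEB128 value, the variable-length integer format AV1
/// uses for OBU fields. Returns the value and the number of bytes consumed.
func readLEB128(_ bytes: [UInt8], at offset: Int) -> (value: UInt64, size: Int)? {
    var value: UInt64 = 0
    for i in 0..<8 where offset + i < bytes.count {
        let byte = bytes[offset + i]
        value |= UInt64(byte & 0x7F) << (7 * i)
        if byte & 0x80 == 0 { return (value, i + 1) }
    }
    return nil
}

/// Given a metadata OBU payload, returns the T.35 body (which would then be
/// unpacked into Dolby Vision metadata) or nil if it is something else.
func extractT35(fromMetadataOBUPayload payload: [UInt8]) -> [UInt8]? {
    guard let (type, size) = readLEB128(payload, at: 0),
          type == metadataTypeITUT35,
          payload.count > size + 1,
          payload[size] == t35CountryCode else { return nil }
    return Array(payload[(size + 1)...])
}
```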

    To verify the setup’s functionality, a series of identical Instagram posts were created, both with and without Dolby Vision metadata. Partners at Dolby measured the brightness of each post using a display color analyzer, across varying screen brightness levels.

    They captured the following:

    [Chart: Screen brightness settings versus image brightness, with and without Dolby Vision.]

    In this chart, the X-axis represents the screen brightness setting and the Y-axis represents the observed image brightness. The results demonstrate that with Dolby Vision metadata present, the brightness of the content much more closely follows the brightness setting of the screen.

    The implementation was successful, but further work was required.

    Testing the Dolby Vision Implementation

    At Meta, new features undergo A/B testing before deployment to ensure expected performance. To A/B test metadata embedded within a video bitstream, an additional version of every video containing the new metadata was produced. This new version was delivered to a randomly distributed test population, while a randomly distributed control population continued to receive the existing experience. At this scale, it can be asserted that roughly equal populations will watch both versions of each video.
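
    A deterministic bucketing scheme along these lines is sketched below, purely to illustrate the methodology (it is not Meta's experimentation stack). FNV-1a is used because Swift's built-in Hasher is seeded per process and would not give stable assignments across sessions.

```swift
// FNV-1a: a small, deterministic hash, so the same ID always lands in the
// same bucket.
func fnv1a(_ string: String) -> UInt64 {
    var hash: UInt64 = 0xcbf29ce484222325
    for byte in string.utf8 {
        hash ^= UInt64(byte)
        hash = hash &* 0x100000001b3
    }
    return hash
}

enum Variant { case control, withNewMetadata }

/// Assigns a stable variant per (experiment, ID) pair; at scale the two
/// buckets come out roughly equal in size.
func variant(forID id: String, experiment: String) -> Variant {
    fnv1a(experiment + ":" + id) % 2 == 0 ? .control : .withNewMetadata
}
```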

    For each version, statistics were collected, including video watch time, load time, connection type used for viewing, and any playback errors encountered. An aggregate analysis then compared the version with metadata to the version without.

    The hypothesis was that if the metadata functioned as expected, videos with the new metadata would receive more watch time. However, an initial test on Instagram Reels in 2024 revealed that, on average, videos with Dolby Vision metadata were watched less than their standard counterparts.

    How could this be possible? Isn’t Dolby Vision supposed to improve the image?

    First A/B Test With Dolby Vision Metadata

    Data indicated that people watched less Dolby Vision video because the videos took too long to load, causing users to move on to the next Reel in their feed.

    A reasonable cause for the longer load times was identified: the new metadata added approximately 100 kbps to every video on average (for a 30-second Reel, roughly 375 KB of extra data). While seemingly small, encodings are highly optimized for diverse viewing conditions, and every bit is crucial in some situations. A 100-kbps overhead was sufficient to negatively impact engagement at the margins.

    The solution involved a compressed metadata format. The Dolby team provided another specification that would reduce the metadata overhead by a factor of four, to an average of 25 kbps.

    Would this be sufficient? Another test was necessary to determine this, but additional work was required beforehand.

    Support for Dolby Vision metadata compression (and decompression) needed to be implemented in FFmpeg as a bitstream filter. Furthermore, while the uncompressed format could be extracted from the bitstream and handed off to Apple, the compressed format is not natively supported by Apple, so client-side decompression had to be implemented independently.

    Approximately 2000 lines of code later, the system was ready.

    Successful A/B Test

    This time, consumers viewing with Dolby Vision metadata spent more time in the app. This is attributed to users spending more time watching HDR videos in lower-light environments, where screens are set to lower brightness levels and HDR videos with proper metadata are easier on the eyes.

    Due to the tangibly positive outcome of including Dolby Vision metadata, its deployment across Instagram for iOS was justified, making it the first Meta app to leverage Dolby Vision. As of June 2025, all delivered AV1 encodings derived from iPhone-produced HDR include Dolby Vision metadata.

    The Future of Dolby Vision Across Meta

    A final challenge is that Dolby Vision is not widely supported across browsers and displays in the web ecosystem, so the difference it makes cannot be accurately demonstrated on a web page; it is best experienced on Instagram on iPhone. Support for Dolby Vision and amve is now integrated into the encoding recipes and ready for deployment to other platforms, with current efforts focused on extending support to Facebook Reels.

    In collaboration with Dolby, the perceptible problem of HDR metadata preservation has been solved, and the supporting work contributed to FFmpeg is readily available to the community.

    This marks just the beginning. Expansion of Dolby Vision to other Meta apps and their corresponding operating systems is anticipated.

    Acknowledgements

    Thanks are extended to Haixia Shi, the team at Dolby, and Niklas Haas from FFmpeg for their work supporting this effort.
