GenAI metadata

Learn how AEM Assets as a Cloud Service uses Adobe GenAI to generate and augment asset metadata to aid in content management and discovery.

Transcript
In this video, we’ll show you how to set up and use AI-generated metadata in Adobe Experience Manager Assets to enhance content discovery and improve scalability. Metadata contains details about your digital assets, powering everything from search to personalization. Correctly applied metadata makes your assets easier to find, categorize, and recommend. When working with digital assets, manual tagging can be time-consuming, inconsistent, and error-prone. Adobe Experience Manager uses artificial intelligence provided by Adobe to optimize tagging and automatically generate relevant metadata using machine learning, image recognition, and natural language processing. This not only improves the quality of your metadata and saves you time, but also ensures consistency and scalability across large volumes of digital content.

Let’s open Experience Manager Assets and see how this works. We have an extensive asset library, and we’re about to add more images for our new summer marketing campaign. Manually tagging all the new assets would definitely slow us down, so we’ll use generative AI capabilities to apply asset metadata for us. First, navigate to Assets, open your projects folder, and click Add Assets. Let’s drop our assets here. You can use standard image file formats like PNG, JPEG, or TIFF, as well as PSD, GIF, WebP, and so on. See our documentation for the full list of compatible formats. Once you’re done, click Upload. The images will start to be processed.

Once this is complete, select one of the newly added assets and open its details. You can see standard metadata fields here, but we’re interested in the AI Generated tab. First, we have the title that AI has generated for us. It’s a clear and concise headline that captures the core idea of the asset, making it easy to understand at a glance. This title doesn’t replace the asset title that you specify. If we go to the Basic tab and manually type in a title for our asset, that title will be the one shown on the asset card in the browse view. If nothing is provided here, Experience Manager will automatically assign the generated title to the asset. Next, the generated description gives a brief summary of what the asset is about. It helps your team find the relevant asset faster by letting search analyze the content of this field for relevance. Finally, the generated keywords field contains targeted terms that represent the asset’s main themes and subjects. In near real time, we have the relevant metadata added without any manual input required. If you feel that some relevant tags are missing, you can always add them here: simply type the keywords into the field and click Save. Similarly, you can tweak and customize the generated description and title, and you’re all set.

Now, imagine the campaign has started and your content author logs in and searches for “person rock climbing.” With Semantic Search in Experience Manager Assets, you don’t need to use the exact keywords to match the asset metadata like you would with traditional keyword-based search. It will look for similar words and concepts, such as rock climber or hiker, using the generated metadata to quickly find the relevant content. And here you can see the image we’ve uploaded in the results. Notice that the generated title is used for the asset. In a few seconds, you get the assets you need with the power of artificial intelligence. So, now you know how to use AI-generated metadata in Experience Manager Assets.
We hope this will help you effectively manage your digital assets and deliver personalized experiences at scale. Thanks for watching.
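
If you also want to check the generated fields outside the UI, the short sketch below is one possible way to read an asset’s metadata over HTTP after it has been uploaded and processed as shown in the video. It is a minimal sketch, not part of the video: it assumes the AEM Assets HTTP API is available on your author instance and that title, description, and keywords surface in the standard Dublin Core properties (dc:title, dc:description, dc:subject). The host, folder, asset name, and credentials are placeholders, and the exact response shape may differ in your environment.

```python
# Minimal sketch: read an asset's metadata via the AEM Assets HTTP API.
# All values below (host, folder, file name, credentials) are placeholders.
import requests

AEM_HOST = "https://author-pXXXXX-eYYYYY.adobeaemcloud.com"  # placeholder author instance
FOLDER = "projects/summer-campaign"                          # placeholder folder under /content/dam
ASSET = "rock-climbing.jpg"                                  # placeholder asset name
AUTH = ("user", "password")                                  # placeholder; use proper credentials/tokens in Cloud Service

# Fetch the asset's JSON representation exposed by the Assets HTTP API.
response = requests.get(f"{AEM_HOST}/api/assets/{FOLDER}/{ASSET}.json", auth=AUTH)
response.raise_for_status()

# Metadata (including generated or manually edited values) is typically nested
# under properties/metadata in the returned document.
metadata = response.json().get("properties", {}).get("metadata", {})
for field in ("dc:title", "dc:description", "dc:subject"):
    print(f"{field}: {metadata.get(field)}")
```

If the fields come back empty, the asset may still be processing; once processing completes, the same information appears in the asset details view shown in the video.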