Content and Commerce with Adobe Experience Manager as a Cloud Service

This session helps developers get started with Commerce on Adobe Experience Manager as a Cloud Service - from the local setup of the Adobe Experience Manager SDK plus the Commerce Integration Framework to Adobe Experience Manager and Magento in the cloud.


Transcript
Welcome everybody to this Developer Live talk about content and commerce with Adobe Experience Manager in the cloud. My name is Markus Hark and I’m working for the AEM Commerce team, the team which built this integration, what we call the Commerce Integration Framework, which will be the majority of this talk today. So the menu for today: I will give you a quick intro into AEM Content and Commerce and what we actually mean by AEM Commerce. Then we’ll do a deep dive into the technical topics, starting with AEM Commerce on the cloud service. I will show you how you actually get onboarded and get your project on AEM as a Cloud Service started. And once you have the project started, obviously the next logical step is that you also want to develop for this project, so you need to set up your local development environment, and I will show you this as well. Then let’s jump into the topics. All right, AEM content and commerce, what do we actually mean by that? It’s actually quite simple: we mean the box here in the middle. I will start with the left, which is Adobe Experience Manager, the digital asset management and content solution we all know. On the right, we have our commerce engine. We at Adobe would obviously prefer it to be Magento, or Adobe Commerce; however, the integration framework is agnostic and works with non-Adobe, third-party solutions as well. And in the middle is the piece which basically glues both boxes together; this is what we call the Commerce Integration Framework. The idea behind this is that we provide tooling for you out of the box which allows you to integrate these solutions, providing on the one hand all the components, the building blocks you need for building out your storefront experience with AEM.
And on the other hand, we provide the tooling for your business users and your marketers as part of the AEM admin, which allows them to build immersive and nice AEM commerce experiences. So we provide consoles, pickers, et cetera; I come to this in a second. And this is basically the focus for this talk, the box in the middle. So what’s actually in that box? If you open the tool set, we provide you with basically three modules or three sorts of building blocks. First is what we call the Venia reference storefront, which is our reference store. It’s a B2C online store example which you can take for a project as a reference and start your project from there. This project is built using the commerce core components, which is a set of core components. And if you are familiar with the Sites Core Components, this is basically nothing new for you, except they are built for the commerce use cases. And in the middle here, as a teaser, we have the commerce authoring tools, which is a set of tools we provide as part of the CIF add-on, the Commerce Integration Framework, for your business users, which allows them to build commerce experiences and work with the product catalog as part of the AEM authoring environment. So one step at a time. The reference storefront, as I said, is basically a storefront project which you can use for your own project as an accelerator, as a starting point, as a reference to just look into, or simply to do demos. It’s built on top of a B2C use case, so it has everything a typical B2C store has: browsing the shop, navigating the products, the shopping cart, mini cart, user login and logout, my account, et cetera. And it’s a fully functional, working e-commerce store which you can use in a smaller project; you can really use it one-to-one, customize it, and go live with it. It’s built using the core components, and everything is publicly available as open source on GitHub.
Speaking about the core components, this is the second building block. The core components are a set of commerce-related core components. They are part of the AEM Core Components library, so if you’ve seen the Core Components library, if you scroll down in the navigation, there’s an entire section about commerce, and this has all the core components we provide. They are built in a way that they are reusable, so that you can build your custom components on top of them, by either not customizing them at all or with simple customizations. Customization is a good topic here: there is actually a whole talk on this by my colleague, Mark Becker, and he will show you in technical detail how you can use these components and how you can really customize them, from simple customizations like doing a little bit of styling, to more advanced customizations like getting additional commerce data from your commerce backend, optimizing the GraphQL queries, extending the GraphQL queries, et cetera, and even implementing your own custom GraphQL queries. The components themselves are built out of the box for internationalization, so it’s already built in. And the data layer, which was shown in the session before, is also already built in. So it’s all available out of the box. And the third box, the authoring tools: what we provide as part of CIF for cloud is a set of tools and a deep integration with the AEM admin and the AEM authoring environment, which means we provide a backend commerce console where your business users can quickly navigate the product catalog and view all the products in the catalog. It’s integrated into the AEM authoring search, so the AEM omnisearch basically allows you to search for product data as well, and if you just type a certain term, it will find everything: products, pages, assets, whatever is in your repository.
And it’s integrated into the AEM page editor and the experience fragments editor, which means you can easily just drag and drop components onto an experience fragment or onto a page, and it’s easy to build experiences using these components you have built using CIF. That’s it for the quick introduction; switching gears to the cloud service now. Let me start with the architecture and explain how it looks. I will start with Magento on the first slide, and I have another slide, which is slightly different from this one, for non-Magento third-party solutions like Hybris, for example. So this picture shows the data flow. A user enters your website using their browser; that’s basically the first step. The browser accesses the AEM URL, and the AEM URL is requested to deliver an experience. If that was rendered before, it can be delivered by the dispatcher or by a CDN like any other AEM page; nothing special here. If it’s not cached, then the request goes to the AEM publish. On the AEM publish, there are components which require commerce data. This commerce data is loaded from the commerce solution, Magento in this case, via a GraphQL query. So we do a GraphQL call to Magento; the query depends on the component itself and it’s customizable. Magento returns the GraphQL data, we take the data, and the experience is rendered. So nothing really special here. Another use case in the data flow is that we also have client-side components, meaning JavaScript components. The way we distinguish here is basically: for everything which is more static in its data nature, like a product catalog or a product detail page, we try to build server-side components, so they can be rendered by the AEM server and we can easily cache them. For data which is more dynamic and/or personalized, we typically use client-side components, which means the shopping cart, for example, or the entire my-account pages use client-side components.
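To make that server-side/client-side split concrete, here is a small hypothetical TypeScript sketch of what a client-side component does: it sends a GraphQL query for personalized data (the cart) directly from the browser. The query shape loosely follows the Magento GraphQL schema, but the function name, endpoint handling, and cart id are invented for illustration.

```typescript
// Hypothetical sketch: building the GraphQL request a client-side CIF
// component might send directly from the browser. Dynamic, personalized
// data (cart, my-account) is fetched client side, so it is never cached
// by the dispatcher/CDN.
interface GraphqlRequest {
  query: string;
  variables: Record<string, unknown>;
}

function buildCartRequest(cartId: string): GraphqlRequest {
  const query = `
    query GetCart($cartId: String!) {
      cart(cart_id: $cartId) {
        items { quantity product { sku name } }
        prices { grand_total { value currency } }
      }
    }`;
  return { query, variables: { cartId } };
}

// The browser would POST this JSON to the commerce endpoint (or to the
// AEM dispatcher acting as a reverse proxy, to avoid CORS issues):
const req = buildCartRequest("demo-cart-id");
console.log(req.variables.cartId); // "demo-cart-id"
```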
Here the data flow is basically the browser directly executing GraphQL queries against the Magento backend. However, the request can be routed via the CDN and the AEM dispatcher; we provide a simple reverse proxy here, for use cases where you don’t want to expose the backend URL, or simply for CORS reasons, so that you don’t have to fiddle around with multiple endpoints and multiple URLs in the browser. One type of request is a little bit special, and that’s the AEM author, because the AEM author is actually able to see additional backend data from the Magento backend: the author can do requests and retrieve staged, not-yet-live data. For example, you are preparing a new product launch, maybe next month, and you want to already prepare your product experience for this launch. The author can already see this product, even though it is not on the public website yet, and can prepare your marketing pages. You can customize the product detail page with extra marketing content, using experience fragments for example, or blog postings using this product data already, but you cannot put them live. You can only put them live once the product is actually set to published on the backend as well. This is how the data flow looks for Magento. If it’s a non-Magento e-commerce solution, like SAP Hybris or Salesforce Commerce Cloud for example, we add one extra puzzle piece here, and this is related to the backend. On the AEM side of things, on the CIF part and the CIF components, nothing changes: it’s exactly the same components, exactly the same OSGi bundles we deploy to AEM. For the backend, however, we need this sort of API translation, because the third-party commerce system has its own proprietary API, which is not really compatible with our GraphQL queries, so we need to do some mapping. The solution we use here for mapping is Adobe I/O Runtime, which is Adobe’s serverless platform.
And we implement basically a set of runtime functions in JavaScript, which allows you to do the API translation and the object mapping between our API and data format and the commerce backend solution. So the data flow here on the AEM side will basically look the same; on the backend side, it is slightly different. First, the request from AEM or the browser now comes in to the GraphQL endpoint on I/O Runtime. And actually, one thing to add here: it doesn’t necessarily need to be I/O Runtime. If you have a custom platform and you want to implement these functions, for example for your homegrown in-house e-commerce solution which is not available off the shelf, you can also use an alternative platform; we have customers using MuleSoft or AWS Lambda for this, for example. The general concept is still the same. So the GraphQL query, the request, comes in. We provide a set of dispatcher actions; they figure out how to basically decompose the GraphQL query and map it into backend requests. So the GraphQL resolver will invoke a function which does the work: calling the backend commerce system and waiting until the result comes back. Once the result comes back, it transforms and maps the data object from the commerce API to our common GraphQL schema. There might be multiple requests, possibly in parallel; once these are all finished, the GraphQL object hierarchy basically gets assembled and then sent back to the client. As I said, the rest of the data flow on the AEM side is basically the same as before. This was the high-level overview of the entire architecture. One thing I would like to deep dive into a little bit now is the CIF components and how these components work. So this is a little bit of a high-level architecture of the components. And if you have used the Sites Core Components, as I said already, our components are very similar.
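The mapping step described above can be sketched as follows. This is a hypothetical TypeScript illustration (the real runtime actions are JavaScript functions on I/O Runtime): a third-party product payload, whose shape is invented here, is transformed into the common Magento-style GraphQL schema that the CIF components expect.

```typescript
// Invented third-party backend shape, for illustration only.
interface ThirdPartyProduct {
  id: string;
  title: string;
  priceCents: number;
  currency: string;
}

// Simplified slice of the Magento-style GraphQL product schema.
interface GraphqlProduct {
  sku: string;
  name: string;
  price_range: {
    minimum_price: { final_price: { value: number; currency: string } };
  };
}

// The object-mapping step a resolver performs after the backend call
// returns: rename fields and convert units into the common schema.
function toGraphqlProduct(p: ThirdPartyProduct): GraphqlProduct {
  return {
    sku: p.id,
    name: p.title,
    price_range: {
      minimum_price: {
        final_price: { value: p.priceCents / 100, currency: p.currency },
      },
    },
  };
}

// A resolver would call the backend (possibly several calls in parallel),
// map each result like this, and assemble the GraphQL object hierarchy.
const mapped = toGraphqlProduct({
  id: "24-MB01", title: "Joust Duffle Bag", priceCents: 3400, currency: "USD",
});
console.log(mapped.price_range.minimum_price.final_price.value); // 34
```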
We actually follow the same concepts, using the same patterns the Sites Core Components have been using for quite a few years, except the use case is different. And since our use case requires commerce data, we need to do backend calls to the commerce system, so we introduced two new building blocks here for the CIF components. The low-level part is a GraphQL client. We provide a GraphQL client, which is basically a very generic OSGi bundle, layered on top of HTTP requests, to give you all the nice tooling to do GraphQL queries; you can do POST requests, GET requests, et cetera. And the GraphQL client is also able to transform the raw, low-level JSON strings you get from the backend into Java objects, which is nice and handy, because as a Java developer you typically don’t want to fiddle around with large strings which contain your query and then navigate large JSON object trees; you want to work with Java models. For this, we provide a pre-generated set of Java data models. Actually, we are not writing them by hand; we just take the Magento GraphQL schema and generate these models out of the schema. This is the second bundle we provide, and if you use a standard Magento deployment, this will work out of the box. If you have customized your Magento data model, or you have a third-party solution with extra classes, it is easily possible to generate your own model classes and basically put them next to it, so you can work with both. You don’t need to regenerate the entire object tree, because the standard classes can stay the same, and for your customizations and extensions, you put your own classes next to them. And these core components are then used, similar to the Sites Core Components, by your project: either you have simple proxy components which don’t add extra logic, where you can put your styling extensions, your own HTL scripts, et cetera.
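The idea behind the generated data models is simple: typed objects instead of raw JSON strings. Here is a minimal sketch of that idea in TypeScript, keeping one language for the examples in this talk; in the real CIF GraphQL client these are generated Java models, and the types and function below are hand-written stand-ins for illustration.

```typescript
// Hand-written stand-ins for the generated data models; the real models
// are generated from the Magento GraphQL schema.
interface ProductInterface {
  sku: string;
  name: string;
}

interface QueryResponse {
  data: { products: { items: ProductInterface[] } };
}

// What the GraphQL client does for you: turn the raw JSON string from
// the backend into typed objects you can navigate safely.
function parseProducts(rawJson: string): ProductInterface[] {
  const response = JSON.parse(rawJson) as QueryResponse;
  return response.data.products.items;
}

// Example raw response as it would come over the wire:
const raw = `{"data":{"products":{"items":[{"sku":"VT11","name":"Tee"}]}}}`;
const items = parseProducts(raw);
console.log(items[0].name); // "Tee"
```

Custom or extended backend fields would get their own generated types placed next to the standard ones, which is why the standard classes never need to be regenerated.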
And obviously you always have the option to build an entirely custom component, which can also use the data models and the GraphQL client. For the client-side part, these are components that sit a little bit next to it; they are also part of the project, and we deliver them as CIF core components. However, these client-side components are built using React and JavaScript, and we share a lot of logic and code here with the Magento team, which they use for the Magento PWA, to not duplicate a lot of code and effort. These components are also part of the project, and the same patterns as for the Sites Core Components, the classic server-side core components, apply as well: you can style them to your needs and you can also extend them. That’s it for the architecture. And now let’s actually deep dive into the cloud service part. So how do we get started with AEM as a Cloud Service? What you need to start a project for commerce here on the cloud is basically three relatively simple steps, and if you have used AEM Cloud before, not much of this is new. First, you need to be onboarded to commerce; then you need to connect your AEM environment to the commerce solution; and then, like for any other project, you need to deploy your project via Cloud Manager. So let’s go step by step. The first step, the onboarding, is actually a two-step flow. The first part is the entitlement and the provisioning. The entitlement is based on AEM Sites: you need to be a customer of AEM Sites, and AEM Commerce is basically an add-on to AEM Sites. So you need to have an AEM Sites license and entitlement already, or get both together at the same time. The entitlement is done by the Adobe team. For customers, the easiest would be reaching out to your sales representative, and for partners, to your partner manager; they can help you here.
Adobe then does the entitlement, and once this is done, the provisioning can start, and the CIF add-on, which is basically the product part of CIF for your project, will be deployed the next time you run a pipeline. This happens automatically; you don’t need to do anything. Once you have the entitlement, the second step which needs to be done is connecting your environments. The first time, Adobe basically helps you here and does this step: if you provide the endpoint of the commerce system, Adobe can do it. However, this is actually a self-service part, and you can also do it on your own, and you can change it later on your own as needed. For example, if you spin up a few development environments, remove them, and add new development environments later, it’s always easy to configure them against the right Magento endpoint or commerce system. One thing to note here is that the entitlement itself is per Cloud Manager and AEM Cloud program, which means once you have the entitlement for the program, all the environments in this program, meaning all the developer sandboxes, the stage and the production environments, automatically get the CIF add-on. There’s no need to do this for each environment, and if you create a new development environment, it will automatically get the add-on as well, as long as it’s part of this program. The self-service part is basically quite an easy step, because there is actually only one mandatory configuration which is needed, which is connecting AEM to your backend system. And this is done by setting a Cloud Manager variable, which is basically an environment variable which is automatically injected into your AEM environments. The concept is generic, other solutions use it as well, and you may even have used it before. For commerce, we just require you to set one variable, the commerce endpoint, which points to your GraphQL solution. This can be a Magento GraphQL endpoint.
This can be an endpoint, as I showed before, on I/O Runtime, or whatever custom GraphQL endpoint you have for your commerce solution; as long as it implements our GraphQL schema, it will work. Actually, for Magento there is a second, optional configuration, which is providing an API token. It’s needed for the AEM author to access the preview and staged data, because the preview data is typically not publicly accessible; it’s always protected, and you need extra rights on the backend system to see this staged and non-published data. Once this is done, the next step is running a pipeline, and running a pipeline basically means you provide your project code as part of your AEM Cloud Service project. Once you push this to the Git repo, the pipeline will start with step one, the custom build. The Maven build builds the project, and assuming the Maven build succeeds, the pipeline moves to the next step, which is building the AEM base image. The new part here is that on top of the AEM base release image, we put the add-on. This is done automatically, and it’s done before the custom code is deployed, so the custom code can assume the add-on is always there, and you can have dependencies on the add-on code if needed. We do this because we share code between the authoring tools and your project code, and you can use these dependencies in your custom code, because you can assume the add-on is always deployed automatically and before your code gets deployed. The add-on is deployed using a Sling feature archive; this is not specific to commerce, basically all the Adobe add-ons provided on AEM Cloud use the same pattern. Once this image is built, it can be deployed in step two to your stage and development environments, and assuming you are happy with the result and everything works, you can push it forward, doing additional tests and the performance audit, and if everything is fine, it can be deployed to production.
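The author/publish distinction for the optional API token can be sketched like this. This is a hypothetical TypeScript illustration, not the literal CIF implementation: the header name and token handling are invented; the point is only that author-tier requests carry credentials so the backend returns staged data, while publish and browser requests stay anonymous and see only live data.

```typescript
// Hypothetical sketch: the author tier attaches an API token so the
// commerce backend returns staged, not-yet-published data.
function buildHeaders(
  isAuthor: boolean,
  apiToken?: string
): Record<string, string> {
  const headers: Record<string, string> = {
    "Content-Type": "application/json",
  };
  // Publish/browser requests stay anonymous and only see live data;
  // the author adds credentials to unlock preview/staged content.
  if (isAuthor && apiToken) {
    headers["Authorization"] = `Bearer ${apiToken}`;
  }
  return headers;
}

console.log("Authorization" in buildHeaders(true, "secret-token")); // true
console.log("Authorization" in buildHeaders(false));                // false
```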
The pipeline, if you’ve worked with AEM Cloud and Cloud Manager before, is not really special, and especially this second part with the authoring tools is totally transparent to you. In case there are any issues, it’s easy to check the Cloud Manager log. In any Cloud Manager log, once you push the project, you can see the step where the CIF part becomes part of the assembly. If you check the Cloud Manager build-image log, there’s actually a section called feature archive tool, and in this there is a section about processing the CIF add-on. Here you will see which parts of the add-on, which bundles and which packages get deployed, and exactly what version is used. This might be handy for debugging purposes, but if everything works nicely, you normally won’t really need this part. And with this, once you have pushed the project and the pipeline succeeds, you’re actually ready to use your AEM project on the cloud. The next step would be doing local development and changing these projects. Changing these projects requires three basic steps as well. You need your AEM development environment locally, which is the AEM Cloud SDK plus the CIF add-on; both are available on Software Distribution, and I’ll show you this in a second. Next, again the same pattern: you connect it to your commerce system, and then you deploy your project to your local environment. I will do these steps now in a quick demo locally here on my environment to guide you through this. However, the first step, downloading everything, I already did before. As I said, it’s all available on Software Distribution, and you need these two packages: the AEM SDK, where we always recommend getting the latest, and the same applies to the CIF add-on; always just get the latest here and download everything. Step two would be connecting it to your Magento or third-party commerce system. For local development, you can do whatever works best for your use cases.
You can use an Adobe Commerce cloud or Magento cloud environment if you already have one as a customer. For local development, a local Magento or virtual environment works just as well, and for a third-party solution basically the same applies; it depends a little bit on how you actually build the integration layer. Let me now switch to my consoles and IDE to show you this in detail. I have the AEM SDK already up and running here, but I’ll show you what I actually did. Let me go one step back. I got the AEM SDK; actually, I got one last week, so this is not the very newest, but it’s pretty new, just a few days old. I already unzipped everything and moved two copies into my author and publish folders. For the demo, I only need the author, but I have a publish here as well. In the author folder, I ran the AEM command with the JAR file to unpack the quickstart JAR, which does not start AEM; it just unpacks everything, which generates a crx-quickstart/install folder. In that folder, I put the CIF add-on. The CIF add-on you also download as a ZIP, and you can unzip it wherever you want. The only thing you need to be careful about here is that the ZIP file actually contains two .far files, which stands for feature archive: there’s one for the author and one for the publish. Just make sure you pick the right one for the environment you spin up locally. Once this is copied, you are actually ready from the AEM side. The only thing left is the connection to Magento. For this, you need to set the environment variable. For the local SDK, it’s as easy as it gets; it’s just a local operating system environment variable. I exported here my commerce endpoint pointing to my commerce solution. After that, I just started my AEM, and what you get is basically this: the AEM start screen. If you have worked with the AEM SDK before, you might already notice one tiny little difference, which is the commerce icon.
On the plain SDK, there’s no commerce icon; if the CIF add-on is installed, you see the commerce icon for the commerce console. This is basically the first check: if you did everything right, you should see the commerce icon. The second check is that if you click the commerce icon, you should see your catalog. Without any additional configuration, I can already navigate my product catalog. That’s because we provide a tiny little default configuration which binds Magento, or your commerce system, to the default catalog. If you have a default catalog, and Magento typically already provides this out of the box, the binding will be there and it will be working. You can already navigate the product catalog, navigate the category tree, and see the products. Once the backend data is loaded, you can hopefully see the product details for all the products. That’s already done, and that’s part of the authoring tools; the integrated search, et cetera, will already work as well. However, if we go to the Sites console, we have no project deployed; it’s empty because I didn’t deploy my project. That’s what I will do in the next step, and for this I will use the Venia reference storefront. If you already deployed something on AEM Cloud, you would typically check out your Cloud Manager Git repo and use that project. If you start from scratch, you basically have two options. You can use the AEM archetype and bootstrap an empty project; the AEM archetype recently got extended to include CIF as well, so there is an option to include commerce, which will include all the dependencies you need for CIF and all the proxy components. I now install the Venia project, which usually takes about one minute, and I will use this minute to show you a little bit of the project structure. So let me switch to my IDE. This is the same project here. As you see, if you’re familiar with it, it’s a classic AEM project layout.
It has ui.apps, ui.content, and all the client-side components are in the ui.frontend module, the server-side ones in the ui.apps project. The project is called Venia. So we have our components, all the Sites proxy components, and there’s one additional folder called commerce, and within this we have the proxy components for all our CIF-related components. So everything is already set up for me; the same you will get if you bootstrap your AEM project using the AEM archetype. Now let’s switch to my console and see if it’s already deployed. It’s still taking a few seconds, so let’s wait until this is ready, and then we can switch back to AEM and see the project working. Okay, this hopefully is now done. All right, this is almost the end; this was the last package to be deployed. I think we can ignore the tests here and already go to AEM. So if I reload my Sites console, I already see my project deployed. I now have a very simple project; this is the Venia store in a very light version. Let me just open the homepage for the sake of the demo. I won’t go into detail on the project, because you can just get it from GitHub and play with it on your own. The homepage should render quickly, and if I switch to preview, I’m already in preview mode. As you can see, I can already navigate the site: the categories from my backend are imported automatically, and I do not need to do anything extra special here; it’s already connected with the default configuration. As I said, if you use the default store, I can navigate my catalog and my storefront now and already preview my website, take a few products, add them to the cart, and I could actually go through the entire checkout, which are steps I will skip for this demo. Switching back to my slides. So what we have seen now: I installed the add-on, it’s already here, and if you get the slides later, all the steps are in here for convenience; I showed you the catalog, and I deployed the project.
That’s it so far for this short talk. I will leave you with a few resources, and with that we are at the end, and maybe we have one or two minutes more for some Q&A. Otherwise, thank you very much. There’s one question; let me just pick the last one, from Paul: does it need any activation to make the product live, or can it be made live directly in Magento? You have both options. By default, if you make it live in Magento, it will immediately show up on the AEM website. However, we also support combined workflows where, as I said, you can do the preview on the AEM author, maybe even create an AEM Launch for this new product, and then set both live at the same time. But in the simple use case, you just add it to your catalog, set it to live, and it will be available on the website. Okay, I think we are at time. Thank you very much again.

