Deploy to Staging and Production

The process for deploying and going live begins with development, continues to Staging, and ends with going live in Production. Adobe provides an end-to-end environment solution to ensure consistent configurations. Every environment supports direct URL access to the storefront and Admin, as well as SSH access for CLI commands.

When you are ready to deploy your store, you must complete deployment and testing on the Staging environment before deploying to Production. This section provides in-depth instructions and information on the build and deploy process, migrating data and content, and testing.

TIP
Adobe recommends creating a backup of the environment before deployments.

Also, you can enable Track deployments with New Relic to monitor deployment events and help you analyze performance between deployments.
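
For example, a minimal backup sketch with the Cloud CLI, assuming you are logged in to your project (the snapshot:list command is assumed to be available in your CLI version):

code language-bash
# Create an on-demand snapshot (backup) of the target environment before deploying
magento-cloud snapshot:create -e <environment-ID>

# Confirm that the snapshot completed
magento-cloud snapshot:list -e <environment-ID>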

Starter deployment flow

Adobe recommends creating a staging branch from the master branch to best support your Starter plan development and deployment. Then you have two of your four active environments ready: master for Production and staging for Staging.

For detailed information about the process, see Starter Develop and Deploy Workflow.
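
For example, a minimal sketch of creating the staging environment from master with the Cloud CLI, assuming you are logged in to the project (the branch name staging is a convention, not a requirement):

code language-bash
# Check out the master (Production) environment locally
magento-cloud environment:checkout master

# Create an active staging environment as a child branch of master
magento-cloud environment:branch staging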

Pro deployment flow

Pro comes with a large integration environment with two active branches, plus a global master branch and dedicated Staging and Production branches. When you create your project, the code is ready to branch, develop, and push for building and deploying your site. Although the integration environment can have many branches, Staging and Production each have only one branch.

For detailed information about the process, see Pro Develop and Deploy Workflow.
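
To review the environment hierarchy before branching, you can list the environments with the Cloud CLI. A minimal sketch, assuming you are logged in to the project (the branch name new-feature is an example):

code language-bash
# List the Integration, Staging, and Production environments and their status
magento-cloud environment:list

# Create a development branch from the integration environment
magento-cloud environment:checkout integration
magento-cloud environment:branch new-feature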

Deploy code to staging

The Staging environment provides a near-production environment that includes a database, web server, and all services, including Fastly and New Relic. You can push, merge, and deploy fully through the Cloud Console or with Cloud CLI commands in a terminal application.

Deploy code with the Cloud Console

The Cloud Console provides features to create, manage, and deploy code in Integration, Staging, and Production environments for Starter and Pro plans.

For Pro projects, deploy the integration branch to staging:

  1. Log in to your project.

  2. Select the integration branch.

  3. Select the Merge option to deploy to Staging.

  4. Complete all testing in the Staging environment.

  5. When Staging is ready, select the Merge option to deploy to Production.

For Starter, deploy the development branch to staging:

  1. Log in to your project.

  2. Select the prepared code branch.

  3. Select the Merge option to deploy to Staging.

  4. Complete all testing in the Staging environment.

  5. When Staging is ready, select the Merge option to deploy to Production (master).

Deploy code with the command line

The Cloud CLI provides commands to deploy code. You need SSH and Git access to your project.
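
Before you begin, confirm that the CLI is authenticated and that your SSH key is registered. A minimal sketch, assuming the magento-cloud CLI is installed locally (the key path is an example):

code language-bash
# Authenticate the CLI with your account
magento-cloud login

# Register your public SSH key if it is not already added
magento-cloud ssh-key:add ~/.ssh/id_rsa.pub

# Confirm that you can see your projects and environments
magento-cloud project:list
magento-cloud environment:list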

Step 1: Deploy and test the integration environment

  1. After logging into the project, check out the integration environment:

    code language-bash
    magento-cloud environment:checkout <environment-ID>
    
  2. Synchronize your local integration environment with the remote environment:

    code language-bash
    magento-cloud environment:synchronize <environment-ID>
    
  3. Create a snapshot of the environment as a backup:

    code language-bash
    magento-cloud snapshot:create -e <environment-ID>
    
  4. Update code in your local branch as needed.

  5. Add, commit, and push changes to the environment.

    code language-bash
    git add -A && git commit -m "Commit message" && git push origin <environment-ID>
    
  6. Complete site testing.
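
To run the site tests, you need the URL of the integration environment. A minimal sketch for retrieving it, assuming the environment:url command is available in your CLI version (the smoke-test URL is a placeholder):

code language-bash
# Print the storefront URLs for the environment under test
magento-cloud environment:url -e <environment-ID>

# Optional quick smoke test against the returned URL
curl -sI https://<environment-url>/ | head -n 1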

Step 2: Merge changes to Staging and deploy

  1. Check out the Staging environment:

    code language-bash
    magento-cloud environment:checkout <environment-ID>
    
  2. Synchronize your local Staging environment with the remote environment:

    code language-bash
    magento-cloud environment:synchronize <environment-ID>
    
  3. Create a snapshot of the environment as a backup:

    code language-bash
    magento-cloud snapshot:create -e <environment-ID>
    
  4. Merge the integration environment to Staging to deploy:

    code language-bash
    magento-cloud environment:merge <integration-ID>
    
  5. Complete site testing.
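
If the Staging deployment needs troubleshooting, you can review the combined build and deploy log on the remote environment. A minimal sketch, assuming the default var/log/cloud.log location:

code language-bash
# Tail the build and deploy log on the Staging environment
magento-cloud ssh -e <environment-ID> "tail -n 100 var/log/cloud.log"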

Step 3: Deploy to Production

  1. Check out and synchronize your local Production environment, and create a snapshot of the environment as a backup.

  2. Merge the Staging environment to Production to deploy:

    code language-bash
    magento-cloud environment:merge <staging-ID>
    
  3. Complete site testing.
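
After the Production deployment completes, a quick smoke test confirms that the storefront and Admin respond. A minimal sketch, assuming placeholder URLs (your Admin path may differ):

code language-bash
# Verify that the storefront returns a successful response
curl -sI https://<your-production-domain>/ | head -n 1

# Verify that the Admin login page responds
curl -sI https://<your-production-domain>/admin/ | head -n 1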

Migrate static files

Static files are stored in mounts. There are two methods for migrating files from a source mount location, such as your local environment, to a destination mount location. Both methods use the rsync utility. Adobe recommends the magento-cloud CLI commands for moving files between a local and a remote environment, and the rsync method for moving files from a remote source to a different remote location.

Migrate files using the CLI

You can use the mount:upload and mount:download CLI commands to migrate files between the local and remote environment. Both commands use the rsync utility, but the CLI commands provide options and prompts tailored to the Adobe Commerce on cloud infrastructure environment. For example, if you use the simple command with no options, the CLI prompts you to select which mount or mounts to upload or download.

magento-cloud mount:download

Sample response:

Enter a number to choose a mount to download from:
  [0] app/etc
  [1] pub/static
  [2] var
  [3] pub/media
  [4] All mounts
 > 3

Target directory: ~/pub/media/

Downloading files from the remote mount pub/media to pub/media

Are you sure you want to continue? [Y/n] Y

To upload files from a local pub/media/ folder to the remote pub/media/ folder for the current environment:

magento-cloud mount:upload --source /path/to/project/pub/media/ --mount pub/media/

Sample response:

Uploading files from pub/media to the remote mount pub/media

Are you sure you want to continue? [Y/n] Y

  building file list ...   done
  ./
  sample-file.jpeg

  sent 8.43K bytes  received 48 bytes  3.39K bytes/sec
  total size is 154.57K  speedup is 18.23

Use the --help option for the mount:upload and mount:download commands to see more options. For example, there is a --delete option to remove extraneous files during the migration.
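
For example, a minimal sketch of an upload that also removes remote files that no longer exist in the local source (check --help in your CLI version to confirm the option before relying on it):

code language-bash
# Upload local media and delete remote files that are not present in the local source
magento-cloud mount:upload --source /path/to/project/pub/media/ --mount pub/media/ --delete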

Migrate files using rsync

Alternatively, you can use the rsync utility to migrate files.

rsync -azvP <source> <destination>

This command uses the following options:

  • a–archive
  • z–compress files during the migration
  • v–verbose
  • P–partial progress

See rsync help.

NOTE
To transfer media directly between remote environments, you must enable SSH agent forwarding. See the GitHub guidance.

To migrate static files directly between remote environments (the faster approach):

  1. Use SSH to log in to the source environment. Do not use the magento-cloud CLI. Using the -A option is important because it enables forwarding of the authentication agent connection.

    TIP
    To find the SSH access link in your Cloud Console, select the environment and click Access Site.
    code language-bash
    ssh -A <environment_ssh_link@ssh.region.magento.cloud>
    
  2. Use the rsync command to copy the pub/media directory from your source environment to a different remote environment.

    code language-bash
    rsync -azvP pub/media/ <destination_environment_ssh_link@ssh.region.magento.cloud>:pub/media/
    
  3. Log in to the other remote environment to verify the files migrated successfully.
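
A minimal verification sketch for step 3, assuming SSH access to the destination environment:

code language-bash
# Log in to the destination environment
ssh <destination_environment_ssh_link@ssh.region.magento.cloud>

# Compare the file count and total size with the source environment
find pub/media -type f | wc -l
du -sh pub/media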

Migrate the database

WARNING
Database import and export operations can take a long time, which can affect site performance and availability. Schedule import and export operations during off-peak hours to prevent slow performance or outages on Production sites.

Prerequisite: A database dump (see Step 3) must include database triggers. To dump them, confirm that you have the TRIGGER privilege.

IMPORTANT
The integration environment database is strictly for development testing and can include data that you do not want to migrate into Staging and Production.

For continuous integration deployments, Adobe does not recommend migrating data from Integration to Staging and Production. Doing so could move test data into those environments or overwrite important data. Any vital configurations are passed using the configuration file and the setup:upgrade command during build and deploy.

Adobe recommends migrating data from Production into Staging to fully test your site and store in a near-production environment with all services and settings.

NOTE
To transfer media directly between remote environments, you must enable SSH agent forwarding. See the GitHub guidance.

Back up the database

It is a best practice to create a backup of the database. The following procedure uses the guidance from Back up the database.

To dump the database:

  1. Use SSH to log in to the remote environment that contains the database to copy.

  2. List the environment relationships and note the database login information.

    code language-bash
    php -r 'print_r(json_decode(base64_decode($_ENV["MAGENTO_CLOUD_RELATIONSHIPS"]))->database);'
    

    For Pro Staging and Production, the name of the database is in the MAGENTO_CLOUD_RELATIONSHIPS variable (typically the same as the application name and username).

  3. Create a backup of the database. To choose a target directory for the DB dump, use the --dump-directory option.

    For Starter environments and Pro integration environments, use main as the name of the database:

    code language-bash
    php vendor/bin/ece-tools db-dump main
    

    Dump options:

    • --dump-directory=<dir>—Choose a target directory for the database dump
    • --remove-definers—Remove DEFINER statements from the database dump
  4. Though the ECE-Tools method is preferred, another method is to create a database dump file using native MySQL in GZIP format.

    code language-bash
    mysqldump -h <database-host> --user=<database-username> --password=<password> --single-transaction --triggers <database-name> | gzip - > /tmp/database.sql.gz
    

    If you have configured two-factor authentication on the target environment, exclude the related 2FA tables so that you do not have to reconfigure 2FA after the database migration:

    code language-bash
    mysqldump -h <database-host> --user=<database-username> --password=<password> --single-transaction --triggers --ignore-table=<database-name>.tfa_user_config --ignore-table=<database-name>.tfa_country_codes <database-name> | gzip - > /tmp/database.sql.gz
    
  5. Type logout to terminate the SSH connection.
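
After the dump completes, you can verify the archive and copy it to your local machine before ending the session. A minimal sketch, assuming the /tmp/database.sql.gz file from the mysqldump example above:

code language-bash
# On the remote environment, confirm that the archive is readable and inspect the first statements
zcat /tmp/database.sql.gz | head -n 20

# From your local machine, copy the dump over SSH
scp "<environment_ssh_link@ssh.region.magento.cloud>:/tmp/database.sql.gz" .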

Drop and re-create the database

Before importing data, you must drop and re-create the database.

To drop and re-create the database:

  1. Establish an SSH tunnel to the remote environment.

  2. Connect to the database service.

    code language-bash
    mysql --host=127.0.0.1 --user='<database-username>' --pass='<user-password>' --database='<name>' --port='<port>'
    
  3. At the MariaDB [main]> prompt, drop the database.

    For Starter and Pro integration:

    code language-shell
    drop database main;
    

    For Production:

    code language-shell
    drop database <cluster-id>;
    

    For Staging:

    code language-shell
    drop database <cluster-ID_stg>;
    
  4. Re-create the database.

    For Starter and Pro integration:

    code language-shell
    create database main;
    

    For Production:

    code language-shell
    create database <cluster-id>;
    

    For Staging:

    code language-shell
    create database <cluster-ID_stg>;
    
  5. Exit the MariaDB session, and then import the database dump. The sed expression removes DEFINER statements so that the dump imports without requiring the original definer accounts.

    Import for Production:

    code language-shell
    zcat <cluster-ID>.sql.gz | sed -e 's/DEFINER[ ]*=[ ]*[^*]*\*/\*/' | mysql -h 127.0.0.1 -p -u <database-username> <database-name>;
    

    Import for Staging:

    code language-shell
    zcat <cluster-ID_stg>.sql.gz | sed -e 's/DEFINER[ ]*=[ ]*[^*]*\*/\*/' | mysql -h 127.0.0.1 -p -u <database-username> <database-name>;
    
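A minimal sketch of verifying the import, assuming an SSH tunnel to the database service (the port and credentials come from the tunnel:open output or the MAGENTO_CLOUD_RELATIONSHIPS variable):

code language-bash
# Open a tunnel to the database service if one is not already open
magento-cloud tunnel:open -e <environment-ID>

# Count the imported tables to confirm that the import succeeded
mysql --host=127.0.0.1 --port='<port>' --user='<database-username>' --password='<user-password>' \
  -e "SELECT COUNT(*) FROM information_schema.tables WHERE table_schema = '<database-name>';"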