Testing and launch

Workfront Fusion testing should focus on verifying the input and output data exchanged between the connected software platforms and the data transformations performed within a Workfront Fusion scenario. Integration testing also evaluates whether your scenario meets specific business requirements. In short, Workfront Fusion testing confirms that the integration works as expected.
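
For example, verifying output data in a destination system can be as simple as querying it after a scenario run and comparing key fields with the source record. The sketch below is a minimal, hedged illustration, assuming Python 3 with the requests package, a Workfront API key, and placeholder domain, API version, and field names; it is not part of Workfront Fusion itself and should be adapted to your environment.

```python
"""Minimal sketch: confirm a Workfront project created by a Fusion
scenario matches the originating request record.

Assumptions (not taken from this article): Python 3 with the
`requests` package, a Workfront API key, and placeholder values for
the domain, API version, project name, and fields.
"""
import requests

WORKFRONT_HOST = "https://example.my.workfront.com"  # placeholder domain
API_KEY = "YOUR_API_KEY"                              # placeholder key


def find_project(name: str) -> dict:
    """Search Workfront for a project by name and return the first match."""
    resp = requests.get(
        f"{WORKFRONT_HOST}/attask/api/v15.0/proj/search",  # adjust version for your instance
        params={"name": name, "apiKey": API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    results = resp.json().get("data", [])
    if not results:
        raise AssertionError(f"No project named {name!r} was created")
    return results[0]


def check_output(source_request: dict) -> None:
    """Compare fields on the source request with the created project."""
    project = find_project(source_request["expected_project_name"])
    assert project["name"] == source_request["expected_project_name"]
    # Add further field-by-field checks for anything the scenario maps.


if __name__ == "__main__":
    check_output({"expected_project_name": "Website refresh request"})
    print("Destination record matches the source request.")
```

A matching check against the source system (for example, the request record that triggered the scenario) completes the input and output verification described above.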

Testing considerations

In this video, you will learn how to:

  • Consistently consider testing
  • Include source and destination systems
  • Iterate through design and testing
  • Create detailed and shareable documentation
  • Test depth, breadth, complexity, and load
Transcript
There are a variety of considerations to keep in mind when testing a scenario. In this training, we’ll consider some of the major concepts and questions to ask, but with experience and practice, anyone building integrations will begin to craft their own test cases to go through each time they build a new integration.

The first thing to keep at the forefront of your mind is that testing should be a constant part of any integration build or implementation. Effective testing begins with a clear understanding of requirements and how to verify a requirement is met through testing. Testing should not be an afterthought and should not wait until you consider the integration ready to go live or launch. Every stage of building should incorporate testing. Typically, the first testable stage is a prototype that meets some very basic requirements. For example, a prototype of a request queue use case would have a trigger module for request records and an output like a project record. A functional prototype makes it possible for Fusion users to add transformations, complex mappings, conditional router logic, and other scenario features incrementally. Each new feature or change should be tested to make sure it meets the requirements and also that the entire scenario still functions as expected.

Eventually, the scenario should be tested outside of Fusion. For example, if your scenario is triggered by a new case in Salesforce, then testing should be done from within Salesforce. Fusion testing involves both the source and destination systems, not just monitoring what happens in the Fusion scenario. You’ll need to log into the third-party systems you are connecting to as part of your testing, as well as monitor what happens in Fusion as the automation is running.

As mentioned already, from the very beginning of your integration build, the approach you should take is one of constant iteration and testing. Build the first version of the scenario and then test, iterate, repeat, and retest. This applies not only to your Fusion scenarios but also to what may need to be adjusted in connected systems or business practices, if applicable.

There are a couple of methods to make sure the needed testing is performed correctly and to keep everyone involved in building the integration from feeling overwhelmed. To make sure testing is performed correctly, you need to adequately inform and educate the customer or the team performing the testing. You should make sure that a shareable spreadsheet is used for your user acceptance testing. It’s important that this document is shared and allows collaboration. Avoid sending versions of the spreadsheet back and forth, as that typically results in mistakes or confusion for all involved. To avoid feeling overwhelmed by how much testing is required, create a comprehensive testing plan to match, which could be documented in the same spreadsheet.

A test plan should have clearly defined expectations for what is being tested specifically. How do you best do this? First, test cases should be discrete and Fusion-centric. They should test specific requirements of your Fusion scenario and not just be general testing. By avoiding general testing, you can identify nuances such as issues with changing variables. For example, a user entering numbers instead of text into a custom form field, or vice versa. Or maybe a user doesn’t fill out a field as you would typically expect.
Ask yourself if changes in the data from your input system will ever affect what is expected by your output system. A simple rule: if the data could change, definitely test for it. Now, how do you know what data or how much data to test? Focus on these three principles.

First, you need to make sure you are testing the depth of the data. Similar to what we were just talking about, make sure you test all the different fields, values, and pieces of data involved in your integration. For example, for a multi-select field, make sure that if one, two, or 20 options are selected, your automation can handle those variations.

Next, consider the breadth of data. When testing, are you trying multiple different variations of the data coming into your automation? Ask those what-if questions here. What if the project name is changed but everything else remains the same? What if the request is submitted without all of the required data, but then it’s added right after? What if someone puts dates in a different format than what’s expected? And what if the input system sends a corrupted file?

Finally, consider complexity and load. What happens if a file sent from another system is much larger than expected? What happens if someone does a bulk update? Will your automation be able to handle that much input? What happens if you exceed limits set by other APIs or the Fusion system itself? You’ll never be able to handle or consider every possible complexity that your automation may face, but just think beyond what you would expect is possible. A good reminder from Benjamin Graham: “The most important part of every plan is planning on your plan not going according to plan.”

Ultimately, you want to instill a culture of testing on your team or at your organization. Encourage well-thought-out, detailed test plans that involve all systems and variations in data. Make sure testing is not something that you wait to do until the very end of your design, but something that happens early and often as you iterate toward your final version. Do this and you’re on your way to successfully implementing integrations using Fusion.
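
The depth and breadth questions above translate naturally into parameterized test cases. The sketch below is a hypothetical Python example using pytest; normalize_date() is a stand-in for whatever date handling your Fusion scenario’s mapping performs and is not part of Workfront Fusion itself.

```python
"""Minimal sketch: parameterized depth and breadth test cases.

Assumptions (not from this article): Python 3 with pytest installed.
normalize_date() is a hypothetical stand-in for the date handling your
Fusion scenario's mapping performs; the variations mirror the what-if
questions in the transcript above.
"""
from datetime import date, datetime

import pytest


def normalize_date(value: str) -> date:
    """Accept the date formats users are likely to enter and return a date."""
    for fmt in ("%Y-%m-%d", "%m/%d/%Y", "%d %b %Y"):
        try:
            return datetime.strptime(value.strip(), fmt).date()
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {value!r}")


# Breadth: the same field supplied in formats users actually use.
@pytest.mark.parametrize("raw", ["2024-03-01", "03/01/2024", "1 Mar 2024"])
def test_date_formats(raw):
    assert normalize_date(raw) == date(2024, 3, 1)


# Depth: a multi-select field with one, two, or 20 options selected.
@pytest.mark.parametrize("count", [1, 2, 20])
def test_multiselect_depth(count):
    selected = [f"option-{i}" for i in range(count)]
    assert len(selected) == count  # replace with your mapping's real behavior


# Unexpected data: a corrupted or unrecognized value should fail loudly.
def test_corrupted_input_is_rejected():
    with pytest.raises(ValueError):
        normalize_date("not-a-date")
```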

Testing considerations checklist

Your goal in testing is to make sure requirements are met, to prevent scope creep, and to catch anything a Workfront Fusion user might accidentally do to break things.

Keep these guidelines in mind to ensure your testing is consistent and captures all essential elements.

  • Determine what test data is needed based on requirements. Typically the most effective testing is based on well-defined and documented requirements.
  • Plan and communicate how to generate the data needed to test depth, breadth, complexity, and load. Avoid happy-path-only testing. Think of all the ways users may interact with the automations and the wide range of possible data that will be processed (see the test-case matrix sketch after this list).
  • Consider needed input and output data between connected systems. Verify input and output in those systems, not just in Workfront Fusion.
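
One lightweight way to plan that data, referenced in the second item above, is to generate a test-case matrix you can paste into the shared spreadsheet used for user acceptance testing. The sketch below is a hypothetical Python example using only the standard library; the categories and example cases are placeholders to replace with cases derived from your documented requirements.

```python
"""Minimal sketch: generate a shareable test-case matrix as CSV.

Assumptions: Python 3 standard library only. The categories and
example cases are placeholders; replace them with cases derived
from your documented requirements.
"""
import csv

TEST_CASES = [
    # (category, description, expected result)
    ("Depth", "Multi-select custom field with 20 options selected", "All options mapped"),
    ("Breadth", "Required data added only after the request is submitted", "Scenario waits or re-runs cleanly"),
    ("Breadth", "Date entered as 03/01/2024 instead of 2024-03-01", "Date normalized correctly"),
    ("Complexity", "Attached file far larger than a typical upload", "Handled or rejected with a clear error"),
    ("Load", "Bulk update touching hundreds of records at once", "No dropped or duplicated executions"),
]

with open("fusion_test_plan.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Category", "Test case", "Expected result", "Status", "Tester", "Notes"])
    for category, description, expected in TEST_CASES:
        writer.writerow([category, description, expected, "Not run", "", ""])

print("Wrote fusion_test_plan.csv")
```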

Prioritize testing throughout the entire lifecycle of your Workfront Fusion implementation. When designing, think about how you can test whether the design decisions meet requirements. Think through how unexpected data could result in errors and add relevant error handling as you build. Plan to iterate through workable prototypes as you continually test.
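
As one illustration of anticipating unexpected data, the sketch below validates an incoming payload before it is processed, mirroring the kinds of checks you might implement with scenario filters and error handlers. It is a hypothetical Python example; the field names, size limit, and error structure are assumptions and not part of any Workfront Fusion API.

```python
"""Minimal sketch: validate an incoming payload before processing.

Assumptions: plain Python 3; the field names, size limit, and error
structure are hypothetical stand-ins for checks you would implement
with scenario filters and error handlers in Workfront Fusion.
"""
MAX_FILE_BYTES = 10 * 1024 * 1024  # assumed limit for this example


def validate_request(payload: dict) -> list[str]:
    """Return a list of problems; an empty list means the payload is safe to process."""
    problems = []

    # Required field present and non-empty?
    if not str(payload.get("request_name", "")).strip():
        problems.append("request_name is missing or blank")

    # Numbers entered where a number is expected (text vs. number mismatch)?
    if not isinstance(payload.get("priority"), int):
        problems.append("priority should be a number")

    # Oversized attachment?
    if payload.get("file_size", 0) > MAX_FILE_BYTES:
        problems.append("attached file exceeds the expected size")

    return problems


if __name__ == "__main__":
    issues = validate_request({"request_name": " ", "priority": "high", "file_size": 42})
    # Route non-empty results to your error-handling branch instead of the main flow.
    print(issues)
```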

Want to learn more? We recommend the following:

Workfront Fusion documentation
