
Now’s the Time to Keep Learning Flow

source link: https://medium.com/slalom-technology/nows-the-time-to-keep-learning-flow-7594365bc451


Discover how to use more advanced features and unlock greater potential for automation on the Salesforce platform.


Approximately one year ago, Salesforce announced the upcoming retirement of workflow rules and processes as part of the Salesforce automation suite. Flow has been touted as the primary declarative automation tool on the platform, which has led many professionals across the ecosystem to begin investing time and energy into learning Flow.

The response from the Salesforce community has been overwhelming! There are hundreds of resources out there that walk you through how to use Flow and migrate the processes and workflow rules we know and love to this new tool. However, most of these resources focus on the basics of getting started, yet the scalability and power of the Salesforce platform allow us to go well beyond the basics.

I recently had the opportunity to teach a hands-on training (HOT) on Intermediate Flow at the Southeast Dreamin’ conference. By applying a series of real-world use cases, attendees were able to get hands-on with more advanced functionality available in Salesforce Flow.

Here are some highlights from the session that will inspire you to go with Flow.

1. Subflows and error handling

Subflows are a special type of flow specifically designed to be triggered by another flow. Subflows are powerful because they allow you to build functionality you may need in multiple situations just once and then “plug it in” wherever necessary.

The key difference with subflows is that you need to pass information into them to provide context for the interview to execute its actions appropriately. If you’ve ever wondered what the Available for Input and Available for Output checkboxes on variables are for, this is where they come in. Any data you want to receive in your subflow from the main flow needs to go into a subflow variable marked Available for Input. Likewise, any information you want to pass back out of the subflow to the main flow needs to be in a subflow variable marked Available for Output. Basically, you’re building a connector for the data between one flow and another.
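The input/output variable pairing works much like a function signature. As a rough Python analogy (not actual Flow syntax; the subflow name, parameters, and record values here are invented for illustration):

```python
# Rough analogy for a subflow (not actual Flow syntax).
# Parameters play the role of subflow variables marked "Available for Input";
# the return value plays the role of one marked "Available for Output".

def error_handler_subflow(record_id: str, error_message: str) -> str:
    """Receives context from the calling flow and returns a summary."""
    # ...in a real subflow: send an email, log the error, etc...
    summary = f"Error on record {record_id}: {error_message}"
    return summary  # handed back to the caller as an output variable

# The main flow "plugs in" the subflow wherever it is needed,
# mapping its own values onto the subflow's input variables:
status = error_handler_subflow("0065e00000ABCDE", "Required field missing")
```

Just as a function can be called from many places, one subflow can be reused by any number of parent flows, each supplying its own context.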

If you’ve ever wondered how to get started using subflows, a great example is error handling. Error handling refers to the process of executing a specific set of actions when an error occurs instead of letting the automated process fail and displaying a big, scary red error message to your user. For example, a flow could send an email to a designated administrator when an error occurs and present a screen to the user with information about the error. It’s essential to think through where potential breakdown points in your flow might be — generally any time you put a pink data element on the canvas — and how to handle those errors gracefully so your users can feel confident about using the system.

2. Bulk record creation and governor limits

Governor limits are how Salesforce ensures all customers have adequate processing power to run their businesses while sharing a platform. While governor limits aren’t often encountered when working with workflow rules or processes, Flow’s increased power makes running into these limits more likely, so Flow builders must consider their architecture more carefully to avoid limit-related errors. One key way to address this is bulkification with respect to DML statements (creating, updating, or deleting records).

Bulkification is the concept of executing actions on a set of records all at once instead of processing those records individually. We used the scenario of building a set of test data that included both Accounts and related Opportunities in a Sandbox to illustrate bulkification and how it helps you avoid hitting governor limits. Processes handled bulkification for you automatically when you specified a change to all child records of the record that triggered your process, but the flexibility available in flows means you, as a designer, must take database changes into account when planning your architecture. By limiting the number of records a user can create, and by leveraging collections to store new records so we could create them all at once, we successfully demonstrated these concepts.
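The bulkification idea can be sketched outside of Flow. A minimal Python illustration, where `insert_records` is a hypothetical stand-in for a single Create Records element:

```python
# Hypothetical stand-in for one bulk DML call (a single Create Records element).
def insert_records(records):
    """Pretend database insert: one call handles many records."""
    return len(records)  # number of records created

# Anti-pattern: one DML call per record quickly consumes governor limits.
# for account in new_accounts:
#     insert_records([account])   # N calls for N records

# Bulkified: accumulate records in a collection, then create them all at once.
new_accounts = [{"Name": f"Test Account {i}"} for i in range(1, 6)]
created = insert_records(new_accounts)  # a single DML call
```

The pattern is always the same: build up the collection first, and save touching the database for one final element.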

3. Loops

Loops often go together with bulkification and allow you to iterate over a set of “something,” whether that’s records, ice cream flavors, or something else, in Flow. Loops require a collection variable, which is how you store multiple items of the same type together. This was part of the data generation exercise during the HOT. After we created the required number of Accounts, we needed to create a set number of Opportunities for each. To do this, we created a collection variable that stored all the new Accounts and then used that collection of Accounts in a loop that allowed us to work with them one by one.

Flow provides a single variable that represents the current record we are working with from the collection. As we iterated over each Account record, we used the Account’s Id value stored in this variable to associate each new Opportunity with its Account via an Assignment element. We then practiced bulkification again by adding all Opportunities to another collection variable. After looping through each Account and building the correct number of Opportunities for each, we used a single Create Records element to mass-create all Opportunities at once. Loops can be a challenge at first, but real-world use cases like this help solidify the concepts and make them easier to understand.
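The loop pattern from the exercise can be sketched in Python; the record shapes and the `create_records` helper are illustrative stand-ins for Flow elements, not real API calls:

```python
def create_records(records):
    """Stand-in for a single Create Records element."""
    return len(records)

# Collection of newly created Accounts (invented sample data).
accounts = [{"Id": f"001-{i}", "Name": f"Test Account {i}"} for i in range(3)]

opportunities = []        # collection variable used for bulkification
opps_per_account = 2
for account in accounts:  # the Loop element; `account` is the current record
    for n in range(opps_per_account):
        # Assignment element: associate the Opportunity to the current Account
        # using the Account's Id, then add it to the collection.
        opportunities.append({
            "Name": f"{account['Name']} Opp {n + 1}",
            "AccountId": account["Id"],
        })

# One Create Records element mass-creates every Opportunity at once.
created = create_records(opportunities)
```

Note that the create happens after the loop finishes, never inside it; putting a pink data element inside a loop is the classic way to hit governor limits.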

4. Using custom metadata types in flow

Custom metadata types (CMDT) are an incredibly versatile tool that developers have used for years, especially in managed packages, but they aren’t widely understood by admins. We introduced this concept through a scenario that helped us avoid hard-coding information into a flow. While many are familiar with the reasons for not hard-coding IDs in Flow, it is also inadvisable to hard-code other information that could change with reasonable frequency, such as email addresses.

In our scenario, we built a scheduled path on a record-triggered flow to notify a support manager if a case was not addressed by the support team within a set time frame. Our scenario also included notifying a business development manager when leads are untouched and alerting a sales manager when opportunities become stale. We chose to use custom metadata in this scenario both to introduce this concept and because it is more flexible and scalable than a series of checkboxes or a multi-select picklist on the user record, especially if one individual holds more than one of these roles. Further, it allows notifications to non-Salesforce users should that be a requirement.

The custom metadata type includes fields for Label (standard), DeveloperName (standard), First Name (custom), Last Name (custom), and Email (custom). The record we created for notifying the support manager was given a Label of “Support Manager” and a DeveloperName of “Support_Manager”, and the appropriate contact information was filled in. This record was then queried in the flow, and the output was used in the Send Email action’s “To” field. Once the flow was activated and tested, attendees were able to update the email address in the custom metadata record and change where the email was sent without saving and deploying a new version of the flow. This eliminates both the administrative upkeep of managing multiple flow versions over time and the need to send a flow through a lengthy deployment process for something as simple as an email address change.
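The lookup logic can be sketched in Python. The dictionary below stands in for the CMDT records queried by DeveloperName; the record names follow the scenario above, but the contact details and the helper function are invented for illustration:

```python
# Stand-in for the custom metadata records (queried by DeveloperName in Flow).
# Contact details are invented sample data.
NOTIFICATION_SETTINGS = {
    "Support_Manager": {"FirstName": "Pat", "LastName": "Lee",
                        "Email": "support.mgr@example.com"},
    "Sales_Manager":   {"FirstName": "Sam", "LastName": "Kim",
                        "Email": "sales.mgr@example.com"},
}

def get_notification_email(developer_name: str) -> str:
    """Mimics a Get Records on the CMDT, filtered by DeveloperName."""
    return NOTIFICATION_SETTINGS[developer_name]["Email"]

# The flow's Send Email action uses the queried value in its "To" field,
# so editing the CMDT record redirects the email with no new flow version.
to_address = get_notification_email("Support_Manager")
```

The key design point is that the flow only knows the stable DeveloperName; everything that might change lives in data, not in the flow definition.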

5. Reducing Get Records elements by using collection sorting and filtering

On a multi-tenant platform where all customers must share a finite number of resources, understanding and managing governor limits becomes crucial for every Salesforce professional. Hitting one of these limits can lead to unrecoverable errors, making it essential to build flows that stay within these boundaries.

One of the key ways to avoid hitting limits is to reduce your interactions with the database through the pink data elements: Create Records, Update Records, Get Records, and Delete Records. While we’ve talked about reducing these data manipulation language (DML) calls through bulkification, there’s another way to limit the need for them as well: the Collection Filter and Collection Sort elements.

The Collection Filter element allows you to take an existing collection variable and filter it down further by specifying the criteria for records you want to keep. For example, we used a Get Records element to query for all Opportunities in our Trailhead Playground. From there, we used a series of Collection Filter elements to further segment the data multiple times and in multiple ways. We filtered it into Closed Won Opportunities, Opportunities that were Closed Lost, Open Opportunities, and then Open Opportunities with an amount greater than $250,000. This allowed us to get the total count of Open, Closed Won, Closed Lost, and Open High-Value Opportunities while using only a single database query. Without the Collection Filter, this would have taken four separate queries, or we would have had to loop through every Opportunity record four separate times to evaluate them against the criteria and build our filtered collections.
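In Python terms, the single-query-plus-Collection-Filter pattern looks like filtering one in-memory list several ways; the sample Opportunity data is invented for illustration:

```python
# One "query" returns every Opportunity (invented sample data).
opportunities = [
    {"Name": "A", "StageName": "Closed Won",  "Amount": 100_000},
    {"Name": "B", "StageName": "Closed Lost", "Amount": 50_000},
    {"Name": "C", "StageName": "Prospecting", "Amount": 300_000},
    {"Name": "D", "StageName": "Prospecting", "Amount": 120_000},
]

# Each Collection Filter element segments the same list: no extra queries.
closed_won  = [o for o in opportunities if o["StageName"] == "Closed Won"]
closed_lost = [o for o in opportunities if o["StageName"] == "Closed Lost"]
open_opps   = [o for o in opportunities
               if o["StageName"] not in ("Closed Won", "Closed Lost")]
high_value  = [o for o in open_opps if o["Amount"] > 250_000]

# Four counts from a single database query.
counts = (len(closed_won), len(closed_lost), len(open_opps), len(high_value))
```

Notice that `high_value` filters the already-filtered `open_opps` collection, mirroring how Collection Filter elements can be chained off one another.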

The Collection Sort element allowed us to take the Open High-Value Opportunities collection even further. In our scenario, we wanted to present a list of the top five highest-value opportunities in descending order by amount. While a traditional Get Records element does allow a user to sort the results, it doesn’t allow us to limit the number of records retrieved; every record that meets the criteria will be added to the collection, again forcing us to loop through the records individually and add each to a separate collection one by one until our count threshold is reached. Instead, with Collection Sort, we can not only sort the filtered records by the field(s) we designate but also limit the number of records in that collection without any additional calls to the database.
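The sort-and-limit step maps to an in-memory sort plus a slice. A Python sketch, with invented sample data:

```python
# Open high-value Opportunities, already filtered from a single query
# (invented sample data).
high_value = [
    {"Name": "Opp A", "Amount": 600_000},
    {"Name": "Opp B", "Amount": 900_000},
    {"Name": "Opp C", "Amount": 300_000},
    {"Name": "Opp D", "Amount": 750_000},
    {"Name": "Opp E", "Amount": 450_000},
    {"Name": "Opp F", "Amount": 520_000},
]

# Collection Sort: order by Amount descending and keep only the top five,
# with no additional call to the database.
top_five = sorted(high_value, key=lambda o: o["Amount"], reverse=True)[:5]
names = [o["Name"] for o in top_five]
```

Sorting and truncating happen entirely in memory, so the flow still charges only the original Get Records against its query limits.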

Conclusion

While 90 minutes certainly wasn’t enough time to learn all the nuances of these Flow concepts, this session gave attendees at Southeast Dreamin’ the chance to apply Flow concepts to real-world scenarios and gain a deeper understanding of how to use each feature.

If this article has inspired you to continue learning Flow, our Slalom team would be happy to guide you on your journey and help you unlock your true potential with this powerful Salesforce technology.
