
Einstein Analytics - Data Run Improvements

Another release, another improvement in the world of Salesforce. Einstein Analytics is getting better and better! The Salesforce Winter ’20 release notes include an interesting announcement: Salesforce is improving some of the core functionality in Einstein Analytics.

Einstein Analytics does not hold live data in its datasets; it shows the data as of the last time the datasets were refreshed. We refresh datasets and bring data into Einstein Analytics primarily through dataflow, data sync or recipe data run jobs. These jobs can now be scheduled to run every 15, 20, 30 or 60 minutes. For quite a long time the only option was 60 minutes, so the maximum possible frequency has increased to once every 15 minutes. Even though Einstein Analytics doesn’t have live data in datasets, a 15-minute refresh frequency allows for great operational use. We can therefore expect the increased frequency to alleviate some of the need to query live data using SOQL statements, reducing the expected volume of code and so having a positive impact on maintenance.

To activate this new feature, you will need to contact Salesforce support.

With sub-hour data run scheduling enabled, we might run into another relevant limit – the number of dataflow runs in a rolling 24-hour period. Luckily, Salesforce continues the recent trend and increases the 24-hour data run limit from 60 to 120. Now we can run a specific dataflow every 15 minutes (96 runs a day) and still have another 24 runs left within our 120-run limit. And as before, data runs that take less than 2 minutes do not count against the 24-hour limit.
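As a quick sanity check, the run-limit arithmetic above can be sketched as follows. The figures come from the release notes quoted in this article; the function and variable names are illustrative, not a Salesforce API.

```python
# Rolling 24-hour data run limit, as raised in the Winter '20 release.
RUN_LIMIT_PER_24H = 120  # previously 60

def runs_per_day(interval_minutes: int) -> int:
    """Number of scheduled runs that fit in a rolling 24-hour window."""
    return (24 * 60) // interval_minutes

every_15 = runs_per_day(15)                # a dataflow on a 15-minute schedule
remaining = RUN_LIMIT_PER_24H - every_15   # headroom left for other data runs

print(every_15, remaining)  # 96 24
```

Remember that runs completing in under 2 minutes do not count against this limit, so the real headroom may be larger than this worst-case figure.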

We can find ourselves in an interesting situation when some of our data runs take longer than 15 minutes to process, which would prevent us from effectively using the maximum scheduling frequency. If the scheduling service tries to initiate a data run that is already running or queued, the currently running job continues but the new job fails. To prevent this, we need to ensure that the scheduled interval is longer than the time it takes the data run to complete. We can always review past data runs in Data Manager to gauge how long they take. This situation is likely to involve only our most complex dataflows.
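The check described above can be sketched as a small helper: given run durations observed in Data Manager, it tells us whether a proposed schedule interval is safe. This is an illustrative sketch, not a Salesforce API; the names and sample durations are made up.

```python
def interval_is_safe(past_durations_min: list[float], interval_min: int) -> bool:
    """Return True if the schedule interval exceeds the slowest observed run,
    so a new run never starts while the previous one is still going."""
    return max(past_durations_min) < interval_min

# Hypothetical dataflow whose past runs (from Data Manager) took up to
# 18 minutes: it cannot safely run every 15 minutes, but every 20 is fine.
print(interval_is_safe([9.5, 12.0, 18.0], 15))  # False
print(interval_is_safe([9.5, 12.0, 18.0], 20))  # True
```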

Now, Einstein Analytics admins have further motivation to optimise their dataflows to get the most out of this enhancement. Every new release brings more good news for our Einstein Analytics admins and users.

If you want to learn more about Einstein Analytics, please take a look at our classes or contact us for implementation or customised training.


© Copyright Stimulus Consulting 2015