March 13, 2025

All code for this post can be found here: Publishing and Subscribing Salesforce Reports
Having worked in Marketing Operations for a few years before joining Tabsdata, I have quite a bit of experience with Salesforce. Because it sits at the intersection of so many teams (Sales, Marketing, Customer Success, Sales Engineering, Finance, and more), one thing you notice is that even though everyone shares the same database, the ways people use that data diverge massively.
Data Engineering prefers some form of ELT into a data warehouse or data lake, while Sales manages everything within Salesforce Reports. That isn't a problem per se, but it becomes one when the data in your BI tool starts diverging from the data Sales or Marketing sees in their Salesforce reports. At that point you have to rebuild the Salesforce report manually with SOQL and constantly monitor whether the report filters have changed.
Salesforce is also not great at tracking changes to your data at a macro scale. Sure, you can see value changes on an individual record, but field history tracking has to be enabled explicitly and only covers a limited set of fields. So answering business-level questions like "how many MQLs did I have in this report six days ago?" is difficult unless you are explicitly snapshotting your data every day.
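To make that point concrete, here is a minimal illustration (plain pandas, with made-up data and hypothetical column names) of how daily snapshots make point-in-time questions trivial to answer:

```python
from datetime import date

import pandas as pd

# Hypothetical daily snapshots of a lead report: each row carries the
# date the snapshot was taken plus the lead's status on that day.
snapshots = pd.DataFrame({
    "snapshot_date": [date(2025, 3, 1)] * 3 + [date(2025, 3, 7)] * 3,
    "lead_id": ["a", "b", "c", "a", "b", "c"],
    "status": ["MQL", "MQL", "Open", "SQL", "MQL", "MQL"],
})

def mql_count_on(df: pd.DataFrame, day: date) -> int:
    """How many MQLs did the report contain on a given day?"""
    on_day = df[df["snapshot_date"] == day]
    return int((on_day["status"] == "MQL").sum())

# "How many MQLs did I have in this report six days ago?"
print(mql_count_on(snapshots, date(2025, 3, 1)))  # 2
```

Without the `snapshot_date` column (i.e., without versioned snapshots), the historical counts are simply gone.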
With Tabsdata’s 1.3.0 release, we introduced new connectors for Salesforce reports that let you easily publish Salesforce report data. All you need is your Salesforce report name, and you can propagate the report’s data while Tabsdata automatically snapshots and versions it.
In this blog and tutorial, I not only publish data from Salesforce, but also generate a daily lead count by status and a delta showing how those counts differ day-to-day.
The pipeline consists of three parts:
I start by registering a publisher that queries my Salesforce report and publishes it into a Tabsdata table called sf_snapshot
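A publisher registration might look roughly like the sketch below. This is not a verbatim copy of the post's code: the source class name `td.SalesforceReportSource` and its parameters are assumptions on my part, so check the 1.3.0 connector docs and the linked repo for the exact signatures.

```python
import tabsdata as td

# Sketch of a publisher: pulls a Salesforce report and publishes it into
# the Tabsdata table "sf_snapshot". The source class name and parameters
# are assumptions; consult the Salesforce connector docs for the real ones.
@td.publisher(
    source=td.SalesforceReportSource(              # hypothetical class name
        username=td.EnvironmentSecret("SF_USERNAME"),
        password=td.EnvironmentSecret("SF_PASSWORD"),
        security_token=td.EnvironmentSecret("SF_TOKEN"),
        report="Lead Status Report",               # your Salesforce report name
    ),
    tables=["sf_snapshot"],
)
def publish_leads(report: td.TableFrame) -> td.TableFrame:
    # Each invocation stores a new version of sf_snapshot automatically.
    return report
```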
I then build a transformer that does two things: it aggregates a daily lead count by status, and it computes a delta of how those counts change day-to-day.
This makes it very easy to view the state of your leads at a specific point in time and pinpoint when large fluctuations in status are occurring.
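The aggregation itself is simple enough to sketch outside of Tabsdata. Assuming the snapshot has a status column, the transformer's logic amounts to a group-by count plus a day-over-day difference, illustrated here with plain pandas and made-up data (inside Tabsdata, the equivalent logic runs over `td.TableFrame` inputs):

```python
import pandas as pd

# Two hypothetical daily snapshots of the lead report.
yesterday = pd.DataFrame({"status": ["MQL", "MQL", "Open"]})
today = pd.DataFrame({"status": ["MQL", "SQL", "SQL", "Open"]})

def count_by_status(df: pd.DataFrame) -> pd.DataFrame:
    """Daily lead count per status."""
    return df.groupby("status").size().rename("count").reset_index()

def day_over_day_delta(prev: pd.DataFrame, curr: pd.DataFrame) -> pd.DataFrame:
    """Change in per-status counts between two snapshots."""
    merged = count_by_status(prev).merge(
        count_by_status(curr),
        on="status", how="outer", suffixes=("_prev", "_curr"),
    ).fillna(0)
    merged["delta"] = merged["count_curr"] - merged["count_prev"]
    return merged[["status", "delta"]]

delta = day_over_day_delta(yesterday, today)
print(delta)  # MQL: -1, Open: 0, SQL: +2
```

The outer merge plus `fillna(0)` matters: it keeps statuses that appear in only one of the two snapshots, so a brand-new status shows up as a positive delta rather than silently disappearing.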

I then subscribe my two tables into Snowflake.
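The subscriber side follows the same decorator pattern. Again, this is a sketch rather than the post's exact code: the destination class name, its parameters, and the table names are assumptions here, so verify them against the Snowflake connector docs.

```python
import tabsdata as td

# Sketch of a subscriber: writes both output tables to Snowflake whenever
# new versions are published. Class and parameter names are assumptions;
# the table names are hypothetical stand-ins for the transformer's outputs.
@td.subscriber(
    tables=["lead_counts", "lead_count_deltas"],
    destination=td.SnowflakeDestination(           # hypothetical class name
        account=td.EnvironmentSecret("SNOWFLAKE_ACCOUNT"),
        user=td.EnvironmentSecret("SNOWFLAKE_USER"),
        password=td.EnvironmentSecret("SNOWFLAKE_PASSWORD"),
        database="ANALYTICS",
        schema="SALESFORCE",
        destination_table=["LEAD_COUNTS", "LEAD_COUNT_DELTAS"],
    ),
)
def subscribe_leads(counts: td.TableFrame, deltas: td.TableFrame):
    return counts, deltas
```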
With three functions, I was able to build a basic ELT workflow between Salesforce and Snowflake. After invoking my publisher a few times, I could dive into the UI to see my execution history and versioned lead data.
