Hi,
I'm working on a Data Pipeline that loads data into a Dataverse table. I do a row compare to detect changes between loads, so I only load rows that have changed.
Is there any way to hash the concatenation of each row's columns? At the moment it seems I can only concatenate as plain text and then convert the result to Binary. Hashing would help save on space.
Hi @adamlob,
In Dataflow Gen2 / Fabric Data Pipelines (Data Factory), there's no native hash transformation yet.
You can use a Notebook to do it:
from pyspark.sql.functions import sha2, concat_ws

# Concatenate all columns into one pipe-delimited string, then hash it with SHA-256
df_hashed = df.withColumn(
    "row_hash",
    sha2(concat_ws("|", *df.columns), 256)
)
Then save it back to your Lakehouse table and use that hash for change-detection.
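As a rough sketch of that change-detection step (the table names "Target" and "StagedChanges" below are hypothetical placeholders, not anything from your pipeline), a left-anti join on the hash keeps only the rows whose hash is not already in the Lakehouse table:

# Hashes already stored in the Lakehouse ("Target" is a placeholder name)
existing_hashes = spark.read.table("Target").select("row_hash")

# Left-anti join: keep only incoming rows whose hash is not present yet,
# i.e. new or changed rows
changed = df_hashed.join(existing_hashes, on="row_hash", how="left_anti")

# Stage just the changed rows for the Dataverse load
# ("StagedChanges" is also a placeholder)
changed.write.mode("overwrite").saveAsTable("StagedChanges")

One caveat: concat_ws skips null values, so two rows that differ only in which column is null can produce the same hash; coalescing nulls to a sentinel string before concatenating avoids that.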
✅ Benefits:
- Very fast and scalable
- Produces fixed-length SHA-256 hashes (64 hex characters)
- Easy to use as a comparison key
Docs:
- https://spark.apache.org/docs/latest/api/sql/index.html#sha2
Hope this helps!
Best regards,
Antoine