DennesTorres
Impactful Individual

Schema on lakehouse SQL endpoint

Hi,

A SQL endpoint in a lakehouse allows us to create schemas. The tables from the lakehouse are automatically placed in the dbo schema, but we can create new schemas.

Is there any way to include the tables in custom schemas, either using notebooks or the UI? If not, are we only able to include custom objects, such as views, in custom schemas?
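
For example, something along these lines already works on the endpoint (the schema, view, and table names below are only placeholders):

-- Creating a custom schema on the SQL endpoint works
CREATE SCHEMA [reporting];
GO

-- Custom objects such as views can be placed in it
CREATE VIEW [reporting].[sales_by_item]
AS
SELECT Item, SUM(Quantity) AS TotalQty
FROM [dbo].[sales]
GROUP BY Item;
GO

What I can't find is a way to make the lakehouse tables themselves land in a schema like [reporting] instead of dbo.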

 

Kind Regards,

 

Dennes

1 ACCEPTED SOLUTION
puneetvijwani
Resolver IV

@DennesTorres It seems to be the same limitation we used to have in Synapse Analytics lake databases.
Reference Text:

Custom SQL objects in lake databases

Lake databases allow creation of custom T-SQL objects, such as schemas, procedures, views, and the inline table-value functions (iTVFs). In order to create custom SQL objects, you MUST create a schema where you will place the objects. Custom SQL objects cannot be placed in dbo schema because it is reserved for the lake tables that are defined in Spark, database designer, or Dataverse.

 

Link:
https://learn.microsoft.com/en-us/azure/synapse-analytics/metadata/database 
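
For illustration only, the pattern that doc describes would look roughly like this in T-SQL (the schema, function, and table names are made up):

-- In a Synapse lake database, custom objects need a user-created schema; dbo is reserved for lake tables
CREATE SCHEMA [reports];
GO

-- An inline table-valued function placed in the custom schema
CREATE FUNCTION [reports].[OrdersByCustomer] (@CustomerId INT)
RETURNS TABLE
AS
RETURN
(
    SELECT OrderId, OrderDate
    FROM [dbo].[orders]
    WHERE CustomerId = @CustomerId
);
GO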


6 REPLIES

Anonymous
Not applicable

@puneetvijwani We can create custom SQL objects in the dbo schema as well, such as views and procedures. It's only functions that we can't create there.
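
For instance, something like this is accepted in dbo on the lakehouse SQL endpoint (view and table names are just examples), while an equivalent CREATE FUNCTION statement is the part that fails:

-- A view created directly in dbo works on the SQL endpoint
CREATE VIEW [dbo].[recent_sales]
AS
SELECT *
FROM [dbo].[sales]
WHERE SaleDate >= DATEADD(DAY, -30, GETDATE());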

GeethaT-MSFT
Microsoft Employee

Hi @DennesTorres, Spark doesn't support schemas, and in a lakehouse the tables in the SQL endpoint are synchronized from the Spark catalog. So currently, lakehouse tables will only appear in the dbo schema.

 

Regards

Geetha

 

Schemas are available at the SQL endpoint. I transferred a table from dbo to a custom schema, which sounds like an extended property. Is there a way (or will there be one) to define the schema while creating the table in Fabric? That would be beneficial for permissions granted via SQL at the schema level instead of per table.
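
For example, what I'm after is being able to grant access once per schema rather than per table (the schema and role names below are only placeholders):

-- Grant read access on every table and view in the schema with one statement
GRANT SELECT ON SCHEMA::[conformed] TO [report_readers];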

Hi,

 

Could you provide more details about how you managed to transfer a table to a different schema, and whether you could still access this table from a notebook?

 

Kind Regards,

 

Dennes

You can move a table to another schema while using the SQL endpoint and querying via T-SQL (I did it in SSMS). This looks to be nothing more than an extended property on the underlying Delta table.

 

However, from the notebook you are still querying using Spark, which doesn't support schemas, as mentioned by the support team.

 

-- Move the table from dbo into the custom schema, then verify it is reachable there
ALTER SCHEMA [conformed] TRANSFER [dbo].[sales_internal];

SELECT COUNT(*) FROM [conformed].[sales_internal];
