
dpombal
Contributor III

Best ways to write a pandas DataFrame to a Fabric SQL database

Hi all,

I am reading from a REST API using Python and converting the data into a pandas DataFrame.

Now I am looking for the best Fabric item to store the data in and query it with SQL.

 

I do not want to use a Lakehouse, and I think a Warehouse is not the best option either, so I am trying the SQL database (preview).

 

As there are no built-in connectors, I am thinking of using SQLAlchemy / pyodbc. Any recommendations?

 

I am migrating working Python code that stores data in an on-premises SQL Server, and I am trying to move it to a Fabric workspace with full SQL capabilities.

 

What are the best options to store DataFrames as SQL tables, and what are the best connection methods (Entra ID login, user/password, ...)?
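The SQLAlchemy / pyodbc approach mentioned above can be sketched as follows. This is only a sketch, not an official Fabric recipe: the server, database, and table names are placeholders, and the connection-string keywords assume the Microsoft ODBC Driver 18 with Entra ID interactive authentication.

```python
# Sketch: write a pandas DataFrame to a Fabric SQL database via SQLAlchemy + pyodbc.
# Server/database/table names below are placeholders, not real endpoints.
import urllib.parse


def build_odbc_conn_str(server: str, database: str) -> str:
    """Build an ODBC connection string using Entra ID interactive auth."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server},1433;"
        f"Database={database};"
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;TrustServerCertificate=no;"
    )


def write_dataframe(df, server: str, database: str, table: str) -> None:
    """Append df to dbo.<table>; requires sqlalchemy and pyodbc installed."""
    from sqlalchemy import create_engine  # imported lazily, only when writing

    odbc = urllib.parse.quote_plus(build_odbc_conn_str(server, database))
    engine = create_engine(f"mssql+pyodbc:///?odbc_connect={odbc}")
    df.to_sql(table, engine, schema="dbo", if_exists="append", index=False)
```

`to_sql` with `if_exists="append"` mirrors the usual on-premises SQL Server pattern, so migrated code should need little more than a new connection string.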

 

Regards

2 ACCEPTED SOLUTIONS

Hi @dpombal

 

Connect to Your SQL Database - Microsoft Fabric | Microsoft Learn

 

Connect with Python Notebook

Fabric Python Notebooks (preview) offer the ability to run T-SQL code with the T-SQL magic command. In the following steps, connect to a SQL database in Fabric using the %%tsql magic command:

  1. Create a notebook in your workspace with the language set to Python.

  2. In a cell, use the %%tsql magic command. The cell type automatically changes to T-SQL.

    In the following sample, replace <databasename> with the name of your SQL database in Fabric. The -type parameter should be SQLDatabase.

    %%tsql -artifact <databasename> -type SQLDatabase

    Then include your T-SQL command. For example, to run a query from a database named Contoso:

    %%tsql -artifact Contoso -type SQLDatabase
    SELECT * FROM SalesLT.Customer;
  3. You can also bind the results to a dataframe with the -bind argument:

    %%tsql -artifact Contoso -type SQLDatabase -bind df2

For more possibilities to query your data with T-SQL inside Python Notebooks, see Run T-SQL code in Fabric Python notebooks. To see the full syntax, use the %tsql? command. This command displays the help information for the T-SQL magic command, including the available parameters and their descriptions.
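Once the `-bind` cell has run, the bound name behaves like an ordinary pandas DataFrame in subsequent Python cells. A minimal illustration (the sample rows below are made up, standing in for the SalesLT.Customer results):

```python
import pandas as pd

# In the notebook, df2 would come from the "%%tsql ... -bind df2" cell.
# Here we fake a small result set of the same shape, purely for illustration.
df2 = pd.DataFrame(
    {
        "CustomerID": [1, 2, 3],
        "CompanyName": ["A Bike Store", "Progressive Sports", "Modern Cycles"],
    }
)

# Standard pandas operations work on the bound result.
top = df2.sort_values("CustomerID", ascending=False).head(2)
print(top["CompanyName"].tolist())  # ['Modern Cycles', 'Progressive Sports']
```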

 

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.


BhaveshPatel
Honored Contributor

Hi @dpombal 

 

You can convert pandas dataframe to lakehouse/warehouse using python programming language. 

 

SQL Database = Lakehouse = Fabric Warehouse = Power BI Direct Lake Semantic Model. They are all the same. Simple and sweet.

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.


8 REPLIES
tayloramy
Contributor

Hi @dpombal

 

See this blog post: Connect to your SQL database in Fabric using Python Notebook | Microsoft Fabric Blog | Microsoft Fab...

 

If you add a T-SQL cell to your notebook and add
"%%tsql -artifact <databasename> -type SQLDatabase"

 

you should be able to connect to your SQL database. 

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.

Yes, but how can I save a pandas DataFrame into a SQL database table? My question remains.


This option is perfect for reading from a table into a dataframe (df2):

%%tsql -artifact Contoso -type SQLDatabase -bind df2

But I am looking for the inverse: writing from a pandas DataFrame into a SQL database table.

Thanks for your help/suggestions.

I still cannot write in an easy way; this is for reading only.
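For the write direction with plain pandas + pyodbc, one commonly used pattern is a parameterized `executemany` insert with `fast_executemany` enabled. This is a sketch, not Fabric-specific documentation: the `dbo.MyTable` target is hypothetical, and `conn` is assumed to be an already-open pyodbc connection to the SQL database.

```python
import pandas as pd


def dataframe_to_insert_rows(df: pd.DataFrame, table: str = "dbo.MyTable"):
    """Turn a DataFrame into (sql, rows) suitable for cursor.executemany."""
    cols = ", ".join(df.columns)
    placeholders = ", ".join("?" for _ in df.columns)
    sql = f"INSERT INTO {table} ({cols}) VALUES ({placeholders})"
    rows = [tuple(r) for r in df.itertuples(index=False, name=None)]
    return sql, rows


def bulk_insert(conn, df: pd.DataFrame, table: str = "dbo.MyTable") -> None:
    """conn: an open pyodbc connection; inserts all rows of df into table."""
    sql, rows = dataframe_to_insert_rows(df, table)
    cur = conn.cursor()
    cur.fast_executemany = True  # batches parameter sets into fewer round trips
    cur.executemany(sql, rows)
    conn.commit()
```

Because the values are passed as parameters rather than concatenated into the SQL string, this also avoids quoting/injection issues with string columns.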


Can you be more specific, @BhaveshPatel? We are having issues with data types in Spark DataFrames, and this is the reason to avoid using a Lakehouse or Warehouse.

 

We have a program using pandas DataFrames. How can we save/store them in the SQL database in a fast/efficient way without having to convert them to other types?

We are having problems converting pandas DataFrames to Spark, so I still want to use pandas and connect to the SQL database using user/password or some kind of Entra app token.

Regards
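On the Entra app token option: a commonly used pattern with pyodbc is to pass an access token through the `SQL_COPT_SS_ACCESS_TOKEN` pre-connect attribute. The sketch below only shows the encoding and connect call; obtaining the token itself (e.g. via azure-identity) is out of scope, and the connection string must not contain an `Authentication` keyword when a token is supplied.

```python
import struct

SQL_COPT_SS_ACCESS_TOKEN = 1256  # pre-connect attribute ID used by the ODBC driver


def encode_access_token(token: str) -> bytes:
    """Encode an Entra access token the way the ODBC driver expects:
    UTF-16-LE bytes prefixed with their byte length as a 4-byte
    little-endian integer."""
    raw = token.encode("utf-16-le")
    return struct.pack("<i", len(raw)) + raw


def connect_with_token(conn_str: str, token: str):
    """Open a pyodbc connection authenticating with an Entra access token.
    conn_str must NOT include an Authentication= keyword."""
    import pyodbc  # imported lazily, only when actually connecting

    return pyodbc.connect(
        conn_str,
        attrs_before={SQL_COPT_SS_ACCESS_TOKEN: encode_access_token(token)},
    )
```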
