bosho
New Contributor II

sql command in SparkR

Hi,

 

I'm following along with the SparkR demo for Fabric notebooks that's on the Microsoft website here:
https://learn.microsoft.com/en-us/fabric/data-science/r-use-sparkr

However, when I try to run any of the SQL commands in a cell, instead of actually executing the query, my notebook seems to just print the SQL statement back. Any ideas what I might be doing wrong?

Cheers,


1 ACCEPTED SOLUTION
bosho
New Contributor II

Have since seen it's because I had loaded tidyverse in that notebook to work on some R stuff, and dplyr's sql() function masks SparkR's. I think that's why it wasn't executing against my Spark DataFrame.
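
For anyone hitting the same masking issue, one workaround (a sketch, assuming both SparkR and tidyverse are attached in the same notebook; my_table is a placeholder name) is to qualify the call with the namespace explicitly:

```r
library(SparkR)
library(tidyverse)  # attaches dplyr, whose sql() masks SparkR::sql()

# Plain sql() now resolves to dplyr's version, which only tags a string
# as SQL rather than running it. Qualifying the namespace forces the
# SparkR version, which executes against the Spark session:
df <- SparkR::sql("SELECT * FROM my_table")
head(df)
```

Attaching SparkR after tidyverse (so SparkR's sql() masks dplyr's instead) would also work, but explicit qualification is more robust to load order.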


6 REPLIES
HimanshuS-msft
Contributor III

Hello @bosho ,

Can you please share the test code that you are running?

Thanks 
Himanshu

govindarajan_d
Contributor III

Hi @bosho,

 

Did you use the SQL command like this:

sqlquery <- sql("SELECT * FROM TABLE")
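
Expanding on that snippet, here is a minimal end-to-end sketch (assuming a Fabric notebook with a live Spark session and SparkR attached; it uses R's built-in faithful dataset, and faithful_view is a placeholder view name):

```r
library(SparkR)

# Create a Spark DataFrame from a local R data frame and register it
# as a temporary view so it can be queried with SQL
sdf <- createDataFrame(faithful)
createOrReplaceTempView(sdf, "faithful_view")

# sql() executes the query and returns a SparkDataFrame;
# head() collects the first rows back into the R session
result <- sql("SELECT waiting, eruptions FROM faithful_view WHERE waiting > 70")
head(result)
```

If sql() only echoes the statement instead of returning a SparkDataFrame, another package's sql() is likely being picked up instead of SparkR's.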
Anonymous
Not applicable

Hi @bosho ,

Thanks for using Fabric Community.
I cannot reproduce the issue when executing the code below:

[Screenshot: SparkR sql() query executing successfully in a Fabric notebook]


Can you please share your test code so I can guide you better?

bosho
New Contributor II

Thanks for the quick responses. Trying it again today, I can't seem to replicate the issue, and I'm seeing the same results as you. Previously it was just printing the SQL code back, but today it executes as expected.

Anonymous
Not applicable

Hi @bosho ,

Glad to know that you no longer see the issue. Please continue using Fabric Community for your further queries.

