So I'm querying data directly from OMS Log Analytics using Power BI Desktop, and I believe there is an 8 MB hard limit on the data returned by the query. The problem is that I need to query about 30 000 rows, but I hit the 8 MB limit at around 18 000 rows. Is it possible to break the query up so that, for example, query1 returns rows 1 - 18 000, query2 returns rows 18 001 - 28 000, and so on? Then I could merge the queries in Power BI to get a view of all the data.
The problem is that my experience in this field, DAX in particular, is quite limited, so I don't know how to specify this in the Advanced Editor. Any help here would be highly appreciated.
Thanks!
Same Issue. Solved it.
My Need: I have a table in Azure Log Analytics (LA) that accumulates roughly 35K rows per day. I needed to get all rows from LA into Power BI for analysis.
My Issue: I crafted the KQL query I wanted in the LA Logs web UX. I then selected the "Export -> Export to PowerBI M query" feature, pasted it into a blank query in Power BI, and authenticated. I noticed a few bizarre behaviors:
1) Like you said, I was getting a rolling ~35K rows of data; each query would trim just a bit off the first date in my KQL range.
2) I also found that, for each day, the query would opportunistically trim off some rows, as if it were 'guessing' which data I didn't need in order to fit into a limit.
3) No matter what KQL | where TimeGenerated >= ago(xd) clause I wrote, I clearly wasn't going to get back more rows than the limit it held me to.
My Solution (and it works great): In Power Query, I created a new blank table in Power Query/M (not a DAX table!). In that table I used DateTimeZone.UtcNow() to start it off with today's date, then I added a column called [Days Back] with rows for -1, -2, -3 ... -7. Then, with some M, I added another column that offsets today's date by [Days Back], giving me a history of dates. A sketch of that scaffold table follows below.
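Here's a minimal sketch of that scaffold, assuming the column names [Days Back] and TimeGeneratedDateLoop (the latter is the name I reference in the KQL parameter below); everything else is illustrative M:

    let
        // Anchor on today's date in UTC (via DateTimeZone.UtcNow())
        TodayDate = Date.From(DateTimeZone.UtcNow()),
        // One row per day of history to pull (0 = today)
        DaysBack = {0, -1, -2, -3, -4, -5, -6, -7},
        AsTable = Table.FromList(DaysBack, Splitter.SplitByNothing(), {"Days Back"}),
        // Offset today's date by [Days Back] to get one date per row
        WithDates = Table.AddColumn(AsTable, "TimeGeneratedDateLoop",
            each Date.AddDays(TodayDate, [Days Back]), type date)
    in
        WithDates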
Now I have a table from which I can iterate over each date in history, passing each one to my KQL query as a parameter:
    | where TimeGeneratedDate == todatetime('"& Date.ToText(TimeGeneratedDateLoop) & "')
As you can see, after I edited my main LA query to take TimeGeneratedDateLoop as a parameter, I can now get each full day's worth of records without hitting the LA limit. Note that in my case no single day breaches the 8 MB limit; if yours does, you can attack the problem with 12-hour breakdowns instead of a full day.
Here's my final M query for the function:
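Since the exact code depends on your workspace and table, here is a sketch of the shape such a function takes, assuming the Web.Contents shell that the Export feature generates; the workspace ID, the table name MyTable_CL, and the extend that derives TimeGeneratedDate are all placeholders you'd swap for your own:

    // Per-day query function: takes one date from the scaffold table
    // and returns that day's rows from Log Analytics.
    (TimeGeneratedDateLoop as date) as table =>
    let
        // Splice the loop date into the KQL, exactly like the where clause above
        KQL = "MyTable_CL "
            & "| extend TimeGeneratedDate = startofday(TimeGenerated) "
            & "| where TimeGeneratedDate == todatetime('"
            & Date.ToText(TimeGeneratedDateLoop) & "')",
        Source = Json.Document(Web.Contents(
            "https://api.loganalytics.io/v1/workspaces/<workspace-id>/query",
            [Query = [#"query" = KQL, #"timespan" = "P8D"],
             Timeout = #duration(0, 0, 4, 0)])),
        // The API answers with {tables: [{columns: [...], rows: [...]}]}
        TableData = Source[tables]{0},
        ColumnNames = List.Transform(TableData[columns], each [name]),
        Result = Table.FromRows(TableData[rows], ColumnNames)
    in
        Result

Invoking it is then one added column on the scaffold table plus an expand (DatesTable and GetDayOfLogs are illustrative names for the two queries above):

    let
        Invoked = Table.AddColumn(DatesTable, "Data",
            each GetDayOfLogs([TimeGeneratedDateLoop])),
        // Replace the column list with the columns your KQL actually returns
        Expanded = Table.ExpandTableColumn(Invoked, "Data",
            {"TimeGenerated", "Message"})
    in
        Expanded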
NOTE: I also removed this line from the pre-generated query:

    "prefer"="ai.response-thinning=true"

I don't know if it helped, but setting it to false didn't work.
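For reference, that header typically sits in the Headers field of the options record passed to Web.Contents in the exported query, roughly like this (a sketch; everything other than the prefer header is illustrative):

    Source = Json.Document(Web.Contents(
        "https://api.loganalytics.io/v1/workspaces/<workspace-id>/query",
        [Query = [#"query" = KQL, #"timespan" = "P8D"],
         Timeout = #duration(0, 0, 4, 0),
         // This is the line the note above refers to:
         Headers = [#"prefer" = "ai.response-thinning=true"]]))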