Announcing backup and restore improvements for large datasets near the size limit

In August 2021, we announced general availability (GA) of backup and restore for large datasets. This Power BI feature closes an important gap with Azure Analysis Services (Azure AS). You can back up Power BI datasets on a regular basis to meet the data retention and disaster recovery requirements of your organization. You can also use this feature to migrate enterprise BI workloads from Azure AS to Power BI. Another common scenario is rolling back an existing Power BI dataset to a previous version. Rolling back, however, was challenging for large datasets near the maximum dataset size. Thanks to the latest improvements, you can now restore a backup file even when the dataset size is near the SKU limit, so size limits no longer need to be a concern for restorability.
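Backups are taken over the workspace's XMLA endpoint with a TMSL Backup command, for example from SQL Server Management Studio or an automation script. The sketch below is a minimal example; the dataset name and backup file name are placeholders, and the .abf file is written to the Azure Data Lake Storage Gen2 account attached to the workspace:

```json
{
  "backup": {
    "database": "SalesDataset",
    "file": "SalesDataset.abf",
    "allowOverwrite": true,
    "applyCompression": true
  }
}
```

Scheduling this command (for instance, nightly) is one way to meet the regular-backup retention scenario described above.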

For example, a P3 capacity has a max dataset size of 100 GB. If you wanted to maximize memory usage, you might have intended to host a 70 GB dataset on this capacity. However, prior to the recent backup and restore improvements, you would find it challenging to accomplish this goal. While you can minimize the memory requirements for refresh operations through advanced refresh techniques, dataset restore operations would still require at least 50% of the available memory, effectively limiting the max dataset size to less than 50 GB. You could host a 70 GB dataset on a P3 capacity, and you could back up the dataset regularly to meet the data retention requirements of your organization. But you could not roll back the dataset to a previous version by using restore, because there was not enough memory available to load the backup file. Restore would fail because it exceeded the max available memory per dataset on a P3 capacity (70 GB for the originally loaded database + 70 GB to restore the backup file > 100 GB P3 max dataset size).

Fortunately, thanks to the recent backup and restore improvements, you can now use a new /forceRestore option with the restore command to overcome this limitation. When restoring a dataset with the /forceRestore option, Power BI attempts to perform the restore operation even when the available memory is limited. Power BI might unload the original database, temporarily disconnecting users, but the restore operation will succeed.
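The restore itself is a TMSL Restore command issued over the XMLA endpoint. The sketch below assumes the syntax shown in the original announcement, where /forceRestore is appended to the backup file name in the file property; the dataset and file names are placeholders:

```json
{
  "restore": {
    "database": "SalesDataset",
    "file": "SalesDataset.abf/forceRestore",
    "allowOverwrite": true
  }
}
```

With /forceRestore in place, the existing 70 GB database can be unloaded first, freeing enough memory for the backup file to load within the 100 GB limit.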

If you forget to include the /forceRestore option when attempting to restore a dataset near the size limit, you might get the following error message to remind you that the /forceRestore option is required in this case:

We cannot restore the dataset backup right now because there is not enough memory to complete this operation. Please use the /forceRestore option to restore the dataset with the existing dataset unloaded and offline.

Source: Microsoft
