Recently I encountered a critical incident with a database restoration failure. I decided to restore a database for DR (Backup-Restore) testing. As DBAs, we should perform random database restore tests on a Test Server before any disaster strikes our real Production Server, so that we can ensure our backups are valid and restorable without any issues.
My "Test" database has memory-optimized tables and is 575 GB in size. Troubleshooting a database with memory-optimized tables was a fairly new topic for me. A memory-optimized table, introduced in SQL Server 2014, is simply a table that has two copies, one active in memory and one durable on disk, whether that includes the data or just the schema. Since memory is flushed on a restart of the SQL Server service, SQL Server keeps a physical copy of the table that is recoverable.
Issue: Restoration of the "XYZ" database failed on our Test Server with the following error:
Msg 41379, Level 16, State 0, Line 0
Restore operation failed for database 'XYZ' due to insufficient memory in the resource pool 'default'. Close other applications to increase the available memory, ensure that both SQL Server memory configuration and resource pool memory limits are correct and try again. See 'http://go.microsoft.com/fwlink/?LinkID=507574' for more information.
Msg 3456, Level 16, State 1, Line 2
Could not redo log record (310751:4331:4), for transaction ID (2:-741982062), on page (1:229325), allocation unit 72057637921488896, database 'XYZ' (database ID 7). Page: LSN = (310735:185062:258), allocation unit = 72057637921488896, type = 1. Log: OpCode = 2, context 2, PrevPageLSN: (310749:76109:7). Restore from a backup of the database, or repair the database.
Expression: db->TransactionMap->ComputeCount() == 0
Process ID: 7392
Restore was successful but deferred transactions remain. These transactions cannot be resolved because there are data that is unavailable. Either use RESTORE to make that data available or drop the filegroups if you never need this data again. Dropping the filegroup results in a defunct filegroup.
RESTORE WITH CONTINUE_AFTER_ERROR was successful but some damage was encountered. Inconsistencies in the database are possible.
RESTORE DATABASE successfully processed 15513188 pages in 1867.005 seconds (64.915 MB/sec).
Msg 3313, Level 16, State 1, Line 2
During redoing of a logged operation in database 'XYZ', an error occurred at log record ID (310751:4331:4). Typically, the specific failure is previously logged as an error in the Windows Event Log service. Restore the database from a full backup, or repair the database.
I strongly believe this is a misleading error message: it tells me to restore the database from a full backup or repair the database, and warns that some damage was encountered.
Well, I even ran the RESTORE VERIFYONLY command, and it reported that my backup was valid/good. No doubt there...
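For reference, this is the kind of check I ran (the backup path here is a placeholder):

```sql
-- Verify the backup media is readable and complete.
-- Note: VERIFYONLY validates the backup file itself; it does NOT check
-- whether the target server has enough memory to bring the
-- memory-optimized data back online, which is why it can pass
-- while the actual restore still fails.
RESTORE VERIFYONLY
FROM DISK = N'X:\Backup\XYZ_Full.bak'
WITH CHECKSUM;
```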
Let's troubleshoot a bit more... while restoring:
- The error log suggested that the restoration failed because of insufficient memory for SQL Server.
- The backup metadata returned by RESTORE HEADERONLY showed an in-memory filestream filegroup.
- The memory allocated to SQL Server was fully utilized before the restoration failed.
- On Production, I checked the memory usage of the memory-optimized tables feature, which came to 70 GB.
- The Test Server OS memory was 60 GB.
Let's find the memory usage per table; the really interesting part here is the memory-optimized tables. First I looked in the GUI...
Even that wasn't very clear to me, so let's investigate further using a query...
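One way to query this is the sys.dm_db_xtp_table_memory_stats DMV; a sketch:

```sql
-- Memory consumed by each memory-optimized table in the current database
SELECT
    OBJECT_SCHEMA_NAME(t.object_id)            AS SchemaName,
    OBJECT_NAME(t.object_id)                   AS TableName,
    t.memory_allocated_for_table_kb / 1024.0   AS TableAllocatedMB,
    t.memory_used_by_table_kb / 1024.0         AS TableUsedMB,
    t.memory_allocated_for_indexes_kb / 1024.0 AS IndexAllocatedMB
FROM sys.dm_db_xtp_table_memory_stats AS t
ORDER BY t.memory_allocated_for_table_kb DESC;
```

Summing the allocated columns gives the total footprint the feature needs, which on Production came to roughly 70 GB, more than the Test Server's entire 60 GB of RAM.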
Resolution: I allocated 60 GB + 40 GB = 100 GB of memory to SQL Server and the database restored successfully, but I asked SysOps for an additional 5 GB, for a total of 105 GB. The server you restore a database to must have enough available memory for the memory-optimized tables in the database backup; otherwise the database will not come online and will be marked Suspect/Recovery Pending. After all this work, I was finally able to recover the "Test" database on our DBA QA/Test Server.
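Raising the memory cap can be done with sp_configure; the value below matches the 105 GB I ended up with (adjust it for your own server):

```sql
-- Raise 'max server memory' (the value is in MB; 105 GB = 107520 MB)
EXEC sys.sp_configure N'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure N'max server memory (MB)', 107520;
RECONFIGURE;
```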
What are your thoughts? Feel free to comment! In the next article I will touch on how I handled a TB+ database stuck in Recovery Pending state... Yes, I had this horrible situation on my production box and spent almost 11 hours on it. You may encounter similar issues in your own Production environment, who knows; by the way, I took on that challenge and brought the culprit database back online anyway. :)
Install SQL Server and Do the Post-Configuration with Just a Single Click... Because you don't want to waste your time sitting next to the monitor...
!!Automate Operations for the Development Process!! As we know, 100% automation is not possible, but it is possible to automate your daily server administrative tasks. We should create a solid Sustained Engineering plan to gradually automate more and more of our operations and administration.
Okay... do not migrate all of your operations at once. I would suggest automating some operations first and, once you think an operation is stable, moving on to the next phase...
I manage my server administration in a fully automated way, and yes, without the GUI; all I use is SQLCMD, PowerShell, and Windows batch files...
Let me share what I do and what you can do in your environment...
- Database creation/cloning, multiple database refresh/restore/backup, and database migration are fully automated using custom stored procedures and SQLCMD.
- Download the SQL .bak file from an FTP server and restore it onto the database server using an automated WinSCP command!
- Deploy code to multiple databases and multiple servers using SQLCMD.
- Log the whole database deployment process into a central database table using SQLCMD, recording who deployed the code, who took the backup, who restored the database, when, and so on.
- Create/alter/drop SQL logins and change a SQL login's password across multiple servers with a single click.
- Install SQL Server in silent mode, install the latest service pack from CMD, open the SQL port using PowerShell, and reboot the server from the command line; yes, all of this is WRAPPED UP in a single BATCH FILE. You just need ONE CLICK, then go grab a coffee, because I don't like clicking Next, Next, Next in a GUI!
- What about post-configuration? Yes, I run a PowerShell script that calls 17 SQL scripts, such as the backup job, index job, memory settings, Database Mail configuration, default alerts, etc., which gets your instance ready to go live in Production!
- Bulk data loads into the warehouse database using the BCP command-line utility...
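As an illustration of the central deployment log mentioned above, here is a minimal sketch; the table and column names are my own assumptions, not the actual schema I use:

```sql
-- A hypothetical central table for deployment auditing
CREATE TABLE dbo.DeploymentLog (
    LogId        INT IDENTITY(1,1) PRIMARY KEY,
    ServerName   SYSNAME     NOT NULL,
    DatabaseName SYSNAME     NOT NULL,
    Action       VARCHAR(50) NOT NULL,  -- e.g. 'Deploy', 'Backup', 'Restore'
    PerformedBy  SYSNAME     NOT NULL,
    PerformedAt  DATETIME2   NOT NULL DEFAULT SYSUTCDATETIME()
);

-- Each SQLCMD deployment step can append a row like this:
INSERT INTO dbo.DeploymentLog (ServerName, DatabaseName, Action, PerformedBy)
VALUES (@@SERVERNAME, DB_NAME(), 'Deploy', SUSER_SNAME());
```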
For more details please contact me
There was a requirement to create a nonclustered index on multiple columns across 200 tables! Crazy, isn't it?
!!! Even 1,000 (one thousand) tables, I don't care !!!
Can we automate this process dynamically using SQLCMD and T-SQL?
SELECT 'CREATE NONCLUSTERED INDEX [NIDX_' + SYSCOLUMNS.NAME + '] ON ' + SYSOBJECTS.NAME + .............................
GO
---To get the complete code please Contact Me
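The complete script isn't shown here, but the general pattern of generating index DDL from the catalog views can be sketched like this (the filter and target column are my own illustrative assumptions):

```sql
-- Generate CREATE INDEX statements for every table containing a given
-- column; run the generated output via SQLCMD, or feed it to EXEC
-- for full automation.
SELECT 'CREATE NONCLUSTERED INDEX [NIDX_' + c.name + '_' + t.name + '] ON '
       + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.' + QUOTENAME(t.name)
       + ' (' + QUOTENAME(c.name) + ');'
FROM sys.tables  AS t
JOIN sys.columns AS c ON c.object_id = t.object_id
WHERE c.name = 'CustomerId';   -- hypothetical target column
```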
I had a request where more than a thousand .dat files had to be imported into SQL Server. First, I converted those .dat files to .txt files with the following command:
rename *.dat *.txt
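Once the files have a .txt extension, each one can be loaded with BULK INSERT (or with bcp, looping over the files in a batch script); the table name, path, and delimiters below are placeholders:

```sql
-- Load one converted file into a staging table
BULK INSERT dbo.StagingTable
FROM 'C:\import\file0001.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'
);
```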
I suggest developers plan for failure scenarios in order to capture the actual error during script deployment, especially when a query that modifies a portion of the data fails partway through. When this happens, it takes extra work to clean up, and it can potentially break data integrity and damage the database. To avoid that, I always suggest developers wrap their SQL code in a template like this:
BEGIN TRY
    BEGIN TRANSACTION;
    ---Add SQL Logic here
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
    SELECT ERROR_NUMBER() AS ErrorNumber
        ,ERROR_SEVERITY() AS ErrorSeverity
        ,ERROR_STATE() AS ErrorState
        ,ERROR_PROCEDURE() AS ErrorProcedure
        ,ERROR_LINE() AS ErrorLine
        ,ERROR_MESSAGE() AS ErrorMessage;
END CATCH;
Install Virtualmin, Nginx 1.10, PHP 7, MariaDB 10 on Linux Server
Virtualmin is a powerful and flexible web hosting control panel for Linux and BSD systems. Virtualmin is the cost-effective and comprehensive solution to virtual web hosting management.
If you want to build your web-site and host your mail server please Contact Me
This is where I'll be sharing my thoughts on SQL topics that matter to me. Who knows... I might even share documentation, videos, and links to other interesting stuff.
If I catch your interest, let me hear from you...
My Dear TechyFrndZ,
Welcome to the launch of www.zclouddata.com, with its improved web-based application, and to my first blog post, High on SQL!!!
I am so excited to share with you MORE of what I have to offer! The number of ideas, how-tos, and inspirations whirling around in my head is ready to jump out, covering critical problem solutions for Microsoft SharePoint Server, SQL Server, Windows Server, and Linux Server.
I wanted to start by telling you why I created this blog, what you will be seeing more of in my High on SQL posts, and my overall experience with cloud-based and on-premise SQL Server database administration.
Zahid Ahamed (Sagar)