
Timeout on Largest Files report

Created: 18 Sep 2008 • Updated: 22 May 2010 | 4 comments
This issue has been solved. See solution.

I'm trying to run the Server Report "Largest Files" from the NS Console but get the following error:

Sql error in query. Error: System.Data.SqlClient.SqlError: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. Sql CommandText: -- MAX ROWCOUNT SET ROWCOUNT 50000

Does anyone know what I can do to get this report to run successfully? It's similar to what is described in AKB 36049, but unfortunately that didn't work.

I turned down the resource history retention level to 2 months and have purged the DB. Still no luck. Any guidance would be appreciated.

Comments (4)

TGiles

Vake,



If you are currently running Recovery Solution 6.2 SP2, you will want to update the report's underlying stored procedure. The knowledge base article below contains the update; a logic error was found in the report that caused server storage space to be reported incorrectly for files that have multiple revisions.



https://kb.altiris.com/a.........icle=43450&p=1



Also, since the error says the report is timing out, you can run the SQL directly from Query Analyzer. Depending on how many rows it returns and how long it takes to execute, you might have to increase the file size threshold in the report. This is the actual SQL from the report:



SELECT
    CASE
        WHEN s.[NSWrkstaGUID] = '' THEN NULL
        ELSE CAST(s.[NSWrkstaGUID] AS uniqueidentifier)
    END,
    s.[Name],
    CAST(v.[VolumeRootPathName] AS NVARCHAR(5)) AS 'Partition',
    fbh.[FileKey],
    MAX(fbh.[RevisionNumber]) AS 'Revision',
    CAST(MAX(fbh.[CurrentSize])/1048576 AS NUMERIC(15,2)) AS 'MB',
    CAST((CAST(ISNULL(TMP.[size], 0) AS NUMERIC(15,2)) + SUM(CAST(ISNULL(cbi.[BlockLength], 0) AS NUMERIC(15,2))))/1048576 AS NUMERIC(15,2)) AS 'Total Server Space',
    fse.[Name] AS 'File'
FROM
    FileBackupHistory fbh
    LEFT JOIN ClusterBlockInfo cbi
        ON fbh.[FileKey] = cbi.[FileKey] AND
           fbh.[RevisionNumber] = cbi.[RevisionCounter]
    JOIN FileSystemEntry fse
        ON fbh.[FileKey] = fse.[FileKey] AND
           fbh.[VolumeKey] = fse.[VolumeKey]
    JOIN Volume v
        ON fbh.[VolumeKey] = v.[VolumeKey]
    JOIN SBS s
        ON v.[SBSKey] = s.[SBSKey]
    LEFT OUTER JOIN
    (
        SELECT
            mcri.[FileKey] AS 'key',
            SUM(CAST(ISNULL(mcri.[RevisionSize], 0) AS NUMERIC(15,2))) AS 'size'
        FROM
            MegaClusterRevInfo mcri
        GROUP BY
            mcri.[FileKey]
    ) AS TMP
        ON TMP.[key] = fse.[FileKey]
WHERE
    fbh.[CurrentSize] >= ### * 1048576 AND --Need to replace ### with the whole number you are wanting to run the report for.
    fbh.[FileBackupFlags] <> 2
GROUP BY
    fbh.[FileKey], s.[Name], v.[VolumeRootPathName], fse.[Name], s.[NSWrkstaGUID], TMP.size
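
For example, if you wanted the report to cover files of 100 MB and larger (100 MB is just an illustrative cutoff here, substitute whatever threshold your report normally uses), the WHERE clause would read:

WHERE
    fbh.[CurrentSize] >= 100 * 1048576 AND -- example: 100 MB and larger; 1048576 converts MB to bytes
    fbh.[FileBackupFlags] <> 2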



HTH



SOLUTION
KSchroeder

vake,

Try increasing the "Command Timeout" value at Configuration > Server Settings > Notification Server Settings > Database Settings. I think it defaults to 300 seconds (5 minutes); try 600 or 900, or some other multiple of 60 seconds.



The other option is to run the Reports_LargestFiles report manually in Query Analyzer or SQL Server Management Studio, but it wants some temp table name and I'm not quite sure (even after looking at the report code) how to give it what it wants.
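
If you do want to try the stored procedure directly, one generic way to see what parameters and temp table it expects (this is just standard SQL Server inspection, not something documented for the report, and it assumes the procedure lives in the AeXRSDatabase that TGiles mentions below) is:

-- Inspect the report's stored procedure to see what it expects.
USE AeXRSDatabase
GO
EXEC sp_help 'Reports_LargestFiles'      -- lists the procedure's parameters
EXEC sp_helptext 'Reports_LargestFiles'  -- prints its source, including any temp table it creates or references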



The key thing to understand here is that the report isn't really timing out on the NS; it is running too long on your RS SQL server. So if you have a huge RS database and/or an underpowered SQL server for it, that is the problem. Increasing the Command Timeout may help.
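
If you want to gauge how long the query actually runs on the RS SQL server (and so roughly what Command Timeout value it would need), you could wrap TGiles' SQL above in timing statistics; a minimal sketch, using the same 50000 row cap the NS shows in the error message:

-- Rough timing check; run in Query Analyzer against the Recovery Solution database.
SET STATISTICS TIME ON   -- print parse/compile and execution times
SET ROWCOUNT 50000       -- same row cap the NS report sets

-- <paste the Largest Files SQL from TGiles' post here>

SET ROWCOUNT 0           -- remove the row cap again
SET STATISTICS TIME OFF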

Thanks,
Kyle
Symantec Trusted Advisor


vake_dhingra

Running the query you provided on the SQL server worked. Thanks, TGiles.



Having some trouble with the KB article you provided, though; I keep getting SQL errors. Should I be copying and pasting that entire query and running it?

TGiles

The SQL file from the knowledge base article is designed to overwrite the existing stored procedure in your AeXRSDatabase.



All you should have to do is open the file in Query Analyzer and make sure that you run it against your AeXRSDatabase. If that doesn't work, please post the error you are getting.
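
As a rough sketch (the exact file name of the KB attachment isn't shown in this thread), the run boils down to:

-- In Query Analyzer: switch to the Recovery Solution database first, then
-- execute the entire contents of the KB 43450 .sql file so it replaces the
-- existing report stored procedure.
USE AeXRSDatabase
GO
-- <paste the full contents of the KB script here and execute it>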