Space, the final frontier….

We all need space….

Once upon a time I had a 250 GB HDD, and back then that was large. Everything I needed was on that disk, but as the files grew in number, the free space shrank. On that disk I kept a lot of pictures and music, a few movies and a bunch of documents. Suddenly I ran out of disk space…

Quest No.1 – Understand consumption

My first quest was to find out what consumes so much space. For that I used a free disk analyzer called SpaceSniffer. It is a standalone exe file.
You can choose a drive or a path (UNC paths work too), then press Start.

A very intuitive and practical piece of software. It graphically shows the size of every folder and file as a proportionally scaled square, and you go a level deeper by clicking on a particular square. From this screen you can also open, edit and delete files and folders. Best of all, the software is free, and at the end you can create a log file.
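The core idea behind a tool like this is simple: walk the directory tree and roll file sizes up into their parent folders. Here is a minimal Python sketch of that idea (my own illustration, not SpaceSniffer's actual implementation):

```python
import os

def folder_sizes(root):
    """Walk a directory tree and return {folder_path: total_bytes},
    where each folder's total includes everything beneath it."""
    totals = {}
    # Walk bottom-up so each subfolder's total is ready before its parent's.
    for dirpath, dirnames, filenames in os.walk(root, topdown=False):
        size = 0
        for name in filenames:
            try:
                size += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # skip files we cannot stat
        for name in dirnames:
            # Subfolder totals were already computed (bottom-up walk).
            size += totals.get(os.path.join(dirpath, name), 0)
        totals[dirpath] = size
    return totals
```

Sorting the result by size immediately shows you the biggest offenders, which is essentially what the graphical view does.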

Quest No.2 – Duplicity

As the files grew in number, laziness became a problem too: some files got duplicated. After googling and trying a couple of programs, I found one, also free of charge: Fast Duplicate File Finder from MindGems. Easy to install and easy to work with.

It works with data files, audio files and pictures.
For a test I created three folders and put various duplicate files in them. First add the folders, then set up the scan properties. If you want to catch near-duplicates, choose the similar-files mode (slower); otherwise search for 100% identical files. I will leave it at 100%. Press SCAN.

After the scan you get a report like this, sorted by duplicate groups.

The duplicates are automatically checked, so you can review them and delete the copies to save some space.
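Finding 100% identical files boils down to comparing file contents. A minimal Python sketch of that idea (my own illustration, not how MindGems actually implements it) is to hash every file and group equal hashes:

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(folders):
    """Group files that are 100% identical by content hash.
    Returns a list of duplicate groups (lists of paths)."""
    by_hash = defaultdict(list)
    for folder in folders:
        for dirpath, _, filenames in os.walk(folder):
            for name in filenames:
                path = os.path.join(dirpath, name)
                h = hashlib.sha256()
                with open(path, "rb") as f:
                    # Hash in chunks so large files do not load into memory at once.
                    for chunk in iter(lambda: f.read(65536), b""):
                        h.update(chunk)
                by_hash[h.hexdigest()].append(path)
    # Only groups with more than one file are duplicates.
    return [paths for paths in by_hash.values() if len(paths) > 1]
```

Real tools speed this up by first grouping files by size and only hashing within groups, but the grouping-by-content principle is the same.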

So, that’s how I got several GB of space free.

Good luck

TSQL – working with folders and output to file

This article shows how to create a folder for backups. The idea is to create a folder with a date part in its name, and then back up a specific database to that folder.

First we need to declare variables to hold our SQL commands and paths.
use master
declare @sql1 nvarchar(max)
declare @sql2 nvarchar(max)
declare @var1 varchar(100)
declare @var2 varchar(100)

--So first we create a folder manually, called BACKUP. In this folder we will save our subfolders and backups.
set @var1 = N'"C:\BACKUP\SQLDB '+(select convert(varchar,getdate(),104))+'"'    --(CONVERT style 104 turns the date into dd.mm.yyyy format)

--save to @sql1 the statement you wish to execute
set @sql1 = N'exec xp_create_subdir '+@var1

--next, we create a variable which will contain the path and name of the backup file.
set @var2 = N'C:\BACKUP\SQLDB '+(select convert(varchar,getdate(),104))+'\Database_'+convert(varchar,getdate(),112)+'.bkp'

--now, create the @sql2 statement you wish to execute
set @sql2 = N'BACKUP DATABASE Database TO DISK = ' + QUOTENAME( @var2 , '''' )+' WITH FORMAT, stats;'  --QUOTENAME wraps the backup path and filename in single quotes


Now execute both the @sql1 and @sql2 statements:

EXEC sp_executesql @sql1

EXEC sp_executesql @sql2


You can also export a query result to a file with a small command-line utility called BCP.
To call BCP from a SQL query you use the extended stored procedure xp_cmdshell. But before you can use it, it must be enabled in SQL Server's advanced options.
The quickest way is to do it in TSQL:

-- Allow advanced options to be changed.
EXEC sp_configure 'show advanced options', 1
-- Apply the change.
RECONFIGURE
-- Enable the xp_cmdshell feature.
EXEC sp_configure 'xp_cmdshell', 1
-- Update the currently configured value for this feature.
RECONFIGURE


Now we can export it with the following statement:

declare @cmd varchar(4000)
set @cmd = 'bcp "select top 500 * from sysobjects" queryout "c:\id\tempexportfile.txt" -c -t, -T -S ' + @@servername
exec master..xp_cmdshell @cmd

(xp_cmdshell accepts only a literal or a single variable, not an inline concatenation, so the command string is built in @cmd first.)

path to file – resolved on the server side, not the client
-c    use character data format
-t,   use comma as the field terminator (the default is TAB)
-T    use a trusted connection to the server
-S servername    connect to the named server