PowerShell and Lotus Notes

As you might know, PowerShell works best with other Microsoft products, like MS Exchange.

Now what happens when you want to connect PS to a non-Microsoft product? In my case that was Lotus Notes, the collaboration giant. As you probably know, Lotus has its own LotusScript language, with which you can create Lotus databases, agents, users, etc. With it you can do almost anything in Lotus.
Since I like to test and experiment with PowerShell, and one of my projects was a PowerShell-and-Exchange self-service user portal, I decided to play with PowerShell and Lotus Notes. And I must say, it works.

First of all, if you want to connect to Lotus Notes via PS, you must start PowerShell or the PowerShell ISE in 32-bit mode, because the Lotus Notes COM object is 32-bit only.

#Open PowerShell in 32-bit mode
#Start-Process $Env:WINDIR\SysWOW64\WindowsPowerShell\v1.0\powershell.exe
#or the ISE
#Start-Process $Env:WINDIR\SysWOW64\WindowsPowerShell\v1.0\powershell_ise.exe
if ([Environment]::Is64BitProcess) {
    Write-Output "64-bit - NO GO" #the Lotus Notes COM class will not load in a 64-bit process
} else {
    Write-Output "32-bit - OK"
}

Once you open PS or the ISE (I’m more of an ISE or Visual Studio Code person), you can start connecting to Lotus Notes.

One of my tasks was to extract an HTML attachment that was sent every day by our backup system. The code below goes a step further: it walks through all mail documents in the Inbox and, for every message with an attachment, creates a folder per sender and a date subfolder with the attachments in it.

$strUserView   = '($Inbox)'       #programmatic name of the Inbox folder
$DomServer     = "SERVER/LN"      #Domino server name
$DomDBPath     = "mail\user.nsf"  #path to the mail database
$pwd4NotesDB   = "Passw0rd"       #Notes ID password
$ipPath2Export = "C:\Temp"        #root export folder

$DomSession = New-Object -ComObject Lotus.NotesSession #Use LN COM class
$DomSession.Initialize($pwd4NotesDB) #This is when Lotus asks for your password when you open it
$DomDatabase = $DomSession.GetDatabase($DomServer,$DomDBPath) #Initialize Database
$DomView  = $DomDatabase.GetView($strUserView) #Initialize View
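If the Notes client is not installed, or you are still in a 64-bit shell, New-Object fails on the COM class. A small guard around the session creation (a sketch, nothing Lotus-specific about the error handling) makes that failure obvious:

```powershell
#Sketch: fail early with a readable message if the 32-bit Notes COM class is missing
try {
    $DomSession = New-Object -ComObject Lotus.NotesSession -ErrorAction Stop
} catch {
    Write-Error "Cannot create Lotus.NotesSession - is the 32-bit Notes client installed, and is this a 32-bit PowerShell?"
    return
}
```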

Since the script has to go through all the mail, we need a loop.

$CounterF = $DomView.GetFirstDocument() #First document in the view
While ($CounterF -ne $null) {

    $DomNextDocument = $DomView.GetNextDocument($CounterF) #Next document

    #Read some fields (GetItemValue returns an array, so take the first element)
    $DomeLoopSubject = $CounterF.GetItemValue("Subject")[0]
    $DomeLoopFrom    = $CounterF.GetItemValue("From")[0]
    $DomeLoopDate    = $CounterF.GetItemValue("DeliveredDate")[0]
    $DomeLoopDate    = '{0:yyyyMMdd}' -f $DomeLoopDate #Format the date

    #For the folder name we need to clean FROM - strip characters invalid in paths
    $DomeLoopFrom = $DomeLoopFrom -replace '[\\/:*?"<>|]', '_'

    If ($CounterF.HasEmbedded) { #If an attachment exists
        $AttachItem = $CounterF.GetFirstItem("Body") #The Body item holds the attachments
        Foreach ($A in $AttachItem.EmbeddedObjects) {
            $DomAttachSavePath = Join-Path $ipPath2Export $DomeLoopFrom   #Folder per sender
            $DomAttachSavePath = Join-Path $DomAttachSavePath $DomeLoopDate #Date subfolder
            Write-Output "Saving to $DomAttachSavePath"
            New-Item -ItemType Directory -Force -Path $DomAttachSavePath > $null
            $ExtractTo = Join-Path $DomAttachSavePath $A.Name #$A.Name is the attachment file name
            $A.ExtractFile($ExtractTo) #Save the attachment to disk
        }
    }

    $CounterF = $DomNextDocument #Move to the next document
}
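When the loop finishes, it is tidy (though optional) to release the COM object, since PowerShell otherwise holds on to it until the process exits:

```powershell
#Release the Notes COM session when done
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($DomSession) > $null
Remove-Variable DomSession
```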

So, this is how I managed to export all attachments from my Inbox in Lotus Notes.

Good Luck

Enable / Re-enable mount of ISO image in Windows 8

Yesterday I installed CDBurnerXP. After that, I couldn’t mount ISO images anymore.



The solution to this problem is in file association.

Right-click an ISO file and select Properties;



Click the Change button next to "Opens with:"



From the list, choose Windows Explorer.


Close the Properties window, and then try the right-click menu on an ISO file again.
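As an alternative to the Explorer context menu, Windows 8 also ships the Storage module cmdlets, so you can mount and unmount an ISO from PowerShell regardless of the file association (the path below is just an example):

```powershell
#Mount an ISO image (example path)
Mount-DiskImage -ImagePath 'C:\ISO\example.iso'

#...and unmount it again
Dismount-DiskImage -ImagePath 'C:\ISO\example.iso'
```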



Good Luck

Space, the final frontier….

We all need space….

Once upon a time I had a 250 GB HDD, and back then it was large. Everything I needed was on that disk. But as the files grew, my free space shrank. On that disk I had a lot of pictures and music, maybe some movies, and a bunch of documents. Suddenly I ran out of disk space…

Quest No.1 – Understand consumption

My first quest was to find out what consumes so much space. For that I used a free disk analyzer called SpaceSniffer. It is a standalone EXE file.
You can choose a drive or a path; it also works with UNC paths. Then press Start.

Very intuitive and practical software. It graphically shows the size of every folder and file, scaling each square by size. You go a level deeper by clicking a particular square. From this screen you can also open, edit, and delete files and folders. Best of all, the software is free, and at the end you can create a log file.

Quest No.2 – Duplicity

As the files grew in number, a bit of laziness became a problem too: some files got duplicated. After googling and trying a couple of programs, I found one, also free of charge: Fast Duplicate File Finder from MindGems. Easy to install and easy to work with.

It works with data, audio, and picture files.
For a test I created three folders and put various duplicate files in them. First add the folders, then set up the scan properties. If you think the files are merely similar, choose the similar (but slower) method; otherwise find 100% identical files. I will leave it at 100%. Press SCAN.

After the scan you get a report like this, sorted by duplicate groups.

Duplicates are automatically checked, so you can probably delete them and save some space.
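If you prefer scripting over a GUI tool, PowerShell 4.0+ can find exact duplicates too, by grouping files on their content hash; a sketch (the folder path is an example):

```powershell
#List files whose content (SHA256 hash) appears more than once under the path
Get-ChildItem -Path 'C:\DupTest' -Recurse -File |
    Get-FileHash -Algorithm SHA256 |
    Group-Object -Property Hash |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object { $_.Group.Path }
```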

So, that’s how I got several GB of space free.

Good luck

TSQL – working with folders and output to file

This article shows how to create a folder for backups. The idea is to create a folder with a date part in its name, and then back up a specific database to that folder.

First we need to declare some variables to hold our SQL commands.
use master
declare @sql1 nvarchar(max)
declare @sql2 nvarchar(max)
declare @var1 varchar(100)
declare @var2 varchar(100)

--First, create a folder manually called BACKUP. In this folder we will save our subfolders and backups.
set @var1 = N'''C:\BACKUP\SQLDB '+(select convert(varchar,getdate(),104))+''''    --CONVERT style 104 turns the date into dd.MM.yyyy format; the path is quoted because it contains a space

--Save to @sql1 the statement you wish to execute
set @sql1 = N'exec master.dbo.xp_create_subdir '+@var1

--Next, we have to create a variable which will contain the path and name of the backup file.
set @var2 = N'C:\BACKUP\SQLDB '+(select convert(varchar,getdate(),104))+'\Database_'+convert(varchar,getdate(),112)+'.bkp'

--Now create the @sql2 statement you wish to execute
set @sql2 = N'BACKUP DATABASE Database TO DISK = '+QUOTENAME(@var2,'''')+' WITH FORMAT, STATS;'  --@var2 supplies the path and filename of the backup


Now execute both the @sql1 and @sql2 statements:

EXEC sp_executesql @sql1

EXEC sp_executesql @sql2
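As a quick check of the two CONVERT styles used above (style 104 for the folder name, style 112 for the file name), you can run:

```sql
--Style 104 -> dd.MM.yyyy, style 112 -> yyyyMMdd
select convert(varchar(10), getdate(), 104) as style104,
       convert(varchar(8),  getdate(), 112) as style112
```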


You can also export a query result to a file with a small program called BCP.
To call BCP from an SQL query you need the xp_cmdshell procedure, but it must first be enabled in SQL Server's advanced options.
The quickest way is to write the T-SQL for it:

--Allow advanced options to be changed.
EXEC sp_configure 'show advanced options', 1
--Apply the change.
RECONFIGURE
--Enable the xp_cmdshell feature.
EXEC sp_configure 'xp_cmdshell', 1
--Apply the currently configured value for this feature.
RECONFIGURE


Now we can export a query result with the following statements (xp_cmdshell takes a single variable or literal, not an expression, so the command is built first):

declare @cmd varchar(4000)
set @cmd = 'bcp "select top 500 * from sysobjects" queryout "c:\id\tempexportfile.txt" -c -t, -T -S '+@@servername
exec master..xp_cmdshell @cmd

path to file – the path is interpreted on the server side
-c    use character format (default code page for char, varchar or text columns)
-t,   use a comma as the field terminator
-T    use a trusted connection to the server
-S    server name (-S servername)