2019-08-31

RAID-6 in peril


I have been away from the office most of this week. Yesterday I was back and decided to do a visual inspection of the Data Warehouse servers, and to my horror I saw this on the very important database server:








Two failed disks in a RAID-6 array is not something you like to see, but thank God I chose RAID-6 over RAID-5 when I configured the server in 2012. Yesterday was the last Summer Friday this year and everyone responsible for the server was out of office. After some asking around I got the number to Dell support, and after an insane number of robot questions, the robot handed me over to the human switchboard lady, who to my surprise understood my problem and connected me to human support, Raymond, who efficiently took me through a small set of reasonable questions and then told me 'Alright, I'll send you two disks, they should arrive in four hours' (they arrived within two hours). Raymond was not only helpful and efficient, he also understood my last name JOHANSSON and could spell it right, something impossible for most native English speakers. 'Jo... what? Can you spell it, please?' is the normal response when I say my last name, and I'm terrible at spelling in English. I avoid spelling by saying my last name is Johansen, which is the Norwegian variant of my last name, putting a heavy emphasis on the second syllable, JoHANsen, and adding "it's double S". That works in both Britain and the US. But Raymond got it right the first time; he just asked 'Is it double S?'
One of the guys responsible for the server came in to the office, fetched the disks at the reception, and together we hot swapped one of the failed disks. Now, some 24 hours later, it is 60% recreated. I really hope it will be 100% recreated before another disk pops. The disks are from 2012; even the failing disks have done a very good job in continuous heavy duty since then. It's the monitoring that is failing, and that is partly my fault. Wish me good luck, I really need it.

Update 2019-08-31 16:20 CET 91% recreated still counting and looking good☺
Update 2019-08-31 17:20 CET 100% recreated☺

Now we should replace the other failing disk, but my younger son took my car earlier this afternoon, and I do not go to the office on the weekend by public transport to replace a disk, so that will have to wait until tomorrow or Monday.






2019-08-28

Shoot yourself in the foot

All programmers have shot themselves in the foot, and I am no exception. Over the years I have shot myself in the foot so many times I have no toes left, figuratively speaking that is. I accidentally almost cut off my right big toe forty-five years ago, but that is another story. Programmatically shooting yourself in the foot is about destroying a good program by accidentally inserting a few lines of self-destructing code.

"Hey Lars, can you help us? We need to transfer a csv file from Qlikview into the Data Warehouse in real time."
"Why do you ever want to do that? Its like putting the cart before the horse. Actually it's even worse." I said.
"Yes we know, but now we have this setup and it would be of great help if we could send the csv file from Qlikview to the Data Warehouse."

I started to play with some code in the PowerShell ISE environment. It is a very nice environment to play, develop and debug code in. The environment is not cleared between code executions, which is a mixed blessing: normally this is a neat feature, all your variables are kept so you can just insert or add code and proceed to run your program. This time, however, it just created problems as I commented out parts of the code between iterations, so I inserted this line at the top of my code to clear the ISE environment:

Remove-Variable * -ErrorAction SilentlyContinue; Remove-Module *; $error.Clear(); Clear-Host 

This worked nicely, and in no time I had developed a script that first FTPs the csv file to the Data Warehouse server and then, via a call to the Data Warehouse server, invokes a DW workflow inserting the csv file into the Data Warehouse. I then wrapped the PowerShell code in a function 'sendFile2DW' (see below), supplying the csv file, the Data Warehouse and the workflow as parameters; now I could insert a file into the Data Warehouse with:

sendFile2DW "File.csv" "data warehouse" "workflow.xml"



Very nice, except nothing happened, just some error messages. Whatever I did, the parameters were blank. Being inexperienced with PowerShell functions, I wrongly assumed something was wrong with my function setup. I tried God knows how many different parameter definitions, and after hours of futile attempts I scrutinized the function code and then I found:

Remove-Variable * -ErrorAction SilentlyContinue; Remove-Module *; $error.Clear(); Clear-Host 

a bit down in the code. Removing that line, the function worked as expected. This is a perfect example of shooting yourself in the foot.
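The gotcha is easy to reproduce outside the full script. A minimal sketch (the function and parameter names here are made up for illustration): a function's parameters are ordinary variables in its local scope, so `Remove-Variable *` run inside the function body wipes them out too.

```powershell
function Test-FootShot {
    Param ([Parameter(Position=0)] [string]$name)

    Write-Host "before: name=$name"    # prints the argument as expected

    # The foot-shooting line: inside a function it removes the function's
    # own parameter variables along with everything else in the local scope.
    Remove-Variable * -ErrorAction SilentlyContinue

    Write-Host "after: name=$name"     # $name is now blank
}

Test-FootShot "hello"
```

The line is harmless (and handy) at the top of a script run interactively in the ISE, but fatal once the same code is pasted into a function.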



function sendFile2DW {
     Param ([Parameter(Position=0)] [string]$inputfile, [Parameter(Position=1)] [string]$dwuser, [Parameter(Position=2)] [string]$dwschedule) 
      
$x = @"
This script FTPs a file to the Data Warehouse and then executes a schedule in the Data Warehouse.

Parms:
1 - file to send 
2 - the Data Warehouse
3 - Schedule to run in the Data Warehouse (workflow)

Processing:
1 - First we do the necessary configuration by initializing some variables.
2 - Then we FTP
3 - Finally we submit a schedule for execution in the Data Warehouse and wait for the execution to finish

Note! The Data Warehouse web server must be up and running before submitting the schedule! 

Set-ExecutionPolicy -Scope CurrentUser
"@ 
    $myParms = "my parms - $inputfile, $dwuser, $dwschedule"
    write-host "$myParms"
    
    # Start of configuration
    # Config for the SMTP server and mail sender & recipients for error messages
    $SMTPServer = "X1"
    $mailFrom = 'Qlik2DW <X2>'
    $mailTo = 'X3, '
    #$SMTPPort = "587"

    # Config for the Data Warehouse FTP site
    $FTPserver = 'X4';
    $FTPmap    = 'X5';
    $FTPuser   = "X6" 
    $FTPpw     = "X7"

    # Config for the Data Warehouse web server schedule kicker
    $dwwebserver = "X8"

    # End of configuration. Do not touch anything below! 

    #Before we start, where are we?
    $cwd = Get-Location 
    write-host "$MyInvocation.ScriptName runs in $cwd"
    Remove-Variable * -ErrorAction SilentlyContinue; Remove-Module *; $error.Clear(); Clear-Host # the foot-shooting line: wipes out the parameters :(
    write-host "FTP upload file=$inputfile , FTP = $FTPserver, $FTPmap, $FTPuser"

    $FileName = [System.IO.Path]::GetFileName("$inputfile");
    $FTPfileName = $FileName
    $FTPfile = $FTPserver + "/" + $FTPmap + $FTPfileName 
    write-host "Start uploading file $inputfile to FTP as $FTPfile"
    #write-host "ftp://${FTPuser}:${FTPpw}@$FTPserver/$FTPfileName"

    $client = New-Object System.Net.WebClient
    $client.Credentials = New-Object System.Net.NetworkCredential("$FTPuser", "$FTPpw") 
    $uri = New-Object System.Uri("ftp://$FTPfile") 
    
    try {
        $client.UploadFile($uri, $inputfile)
        $client.Dispose()
    } catch {
        $_.Exception.Message
        $emsg = "The FTP upload failed aborting..., please check!"
        Write-Host "$emsg"
        Send-MailMessage -From $mailFrom  -To $mailTo -SmtpServer $SMTPServer -Priority High -DeliveryNotificationOption OnFailure  -Subject 'Error' `
           -Body "$_.Exception.Message `n $emsg `n $myParms"   
        Exit
    } 
    write-host "FTP upload of $inputfile done" 


    write-host "Submitting schedule $dwschedule to $dwwebserver for execution in the Data Warehouse" 

    try {
        $wr = Invoke-WebRequest -Headers @{"Cache-Control"="no-cache"} http://$dwwebserver/dw/$dwuser/$dwschedule -method get
    } catch {
        $_.Exception.Message
        $emsg = "The DW Web server may be down, please check!"
        Write-Host "$emsg"
        Send-MailMessage -From $mailFrom  -To $mailTo -SmtpServer $SMTPServer -Priority High -DeliveryNotificationOption OnFailure  -Subject 'Error' `
           -Body "$_.Exception.Message `n $emsg `n $myParms"   
        Exit
    }

    $wr.InputFields | Where-Object {
        $_.name -like "*"
    } | Select-Object Name, Value

    $dwpid = $wr.Content
    $dwstsdesc = $wr.StatusDescription
    $dwstscd = $wr.StatusCode

    Write-Host "Executing the schedule $dwschedule in the Data Warehouse job PID=$dwpid Status=$dwstsdesc ($dwstscd)"

    If ($dwstscd -ne 200) {
        $emsg = "Oops, An unexpected error occurred! (Status=$dwstscd)"
        Write-Host "$emsg"
        Send-MailMessage -From $mailFrom  -To $mailTo -SmtpServer $SMTPServer -Priority High -DeliveryNotificationOption OnFailure  -Subject 'Error' `
           -Body "$emsg `n $myParms" 

         Exit 
    }

    Write-Host "Waiting for schedule $dwschedule pid=$dwpid to finish, content=$dwpid, desc=$dwstsdesc, statuscd=$dwstscd"

    do {
        Start-Sleep -s 1
        $wrp = Invoke-WebRequest http://$dwwebserver/pid/$dwpid -method get
    #    Write-Host $wrp.Content
        $wrp.InputFields | Where-Object {
            $_.name -like "*"
        } | Select-Object Name, Value
    } while ( $wrp.Content -eq 0)

    Write-Host "Done, the schedule $dwschedule is executed in the Data Warehouse"
    #Pause
}




2019-08-21

Rename Perl6 to Perly

As a complete outside spectator of the Perl6 naming disaster, I proposed a rename to PerlyGate to keep connotations to both old Perl and all the religious mumbo jumbo Perl developers are so fond of. But my suggestion seemed to go unnoticed, which might be due to the fact that this blog is not read by anyone.
But I'll give it another try. Why not PerlySix, which keeps the reference to Perl6 and at the same time is a new name separate from Perl? Or just Perly, a new name keeping the reference to Perl. Both Perly and PerlySix avoid the version number confusion with Perl.

Rename Perl6, and update Perl's version to Perl 7, skipping over the unfortunate Perl 6.
Perly can keep the version number 6 or start from 1; both options sound good to me.

Update: As I understand it, Perl6 has now been renamed to Raku. To me it sounds very 'pale', no flair whatsoever, but I'm not a native English speaker, so what do I know. Anyway, it is the language that counts, not the name.