2019-12-17

Strategy and a Tribute concert

Last weekend we had a company strategy meeting about the future at 11Dim. The board was represented by me, Erik and Axel. We had invited some special expert guests: Adela Juras, Victor Hjelm and Andreas Röine Larsson. We are especially grateful for Adela's insights into human interfaces and project management, Victor's hard work at the meeting, and Andreas' presentation on the experiences from the Avicii Tribute concert, where Andreas' company Devlib was responsible for part of the network and the tills.
The invited guests all contributed substantially to the 2020 strategy plan; Victor's experience of product development proved to be invaluable.


We spent the better part of Saturday discussing future directions, brand identity and development of our own software products, and ended the meeting by watching the Avicii Tribute concert.

After interesting discussions on machine learning, advanced analytics, and agile and lean project management, we decided to concentrate on present Business Intelligence activities, with a focus on including MS Power BI in our BI suite The Data Warehouse. Axel pointed out that at the moment we do not have the resources to start up new large projects; he does not have the time to take our billing system to production status in 2020, but we decided to try to progress to alpha status during next year. We decided not to start up new projects in machine learning and advanced analytics; Erik was asked to follow the trends and prepare us for a cautious start of machine learning in 2021.


Then we had a rather long brainstorming session about brand identity and rebranding.
Decisions made:
Investigate the possibility of changing the company name to the shorter 11Dim AB.
For the logo, colors, shapes and other visual elements we decided to take inspiration from Linton Kwesi Johnson's 1980 album Bass Culture. Both the music and the minimalist graphical design of the album are in line with our company values. Since we could not agree on how this inspiration could translate into a brand promise, we postponed the brand promise to the next regular board meeting.

However, we decided to translate the man walking down the stairs on the Bass Culture album into 'the data warehouse man' walking up the stairs, with colors going from light to more saturated from SW to NE, further enhancing the impression of the strong progress of 11Dim AB. Erik promised to create a draft of a new graphical design and a logo.

As the CEO of 11Dim I want to thank everyone, the board, employees, associates and customers, who helped us at 11Dim make 2019 such a successful year.
The Data Warehouse Man







The board members wish you all a Merry Christmas and a Happy New Year!
Lars   -  CEO
Axel  -  Developer & Data Scientist 
Erik   -  CFO,  Analyst & Graphic design 

2019-12-10

Catch 22

I have a SharePoint list that passed the dreaded 5000 limit.
Trying to add an index to the list results in:

Sorry, something went wrong

The attempted operation is prohibited because it exceeds the list view threshold enforced by the administrator.


In order to overcome the dreaded 5000 limit in SharePoint you need to create a view on an indexed column. But you cannot create an index on a list with more than 5000 items.
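
Just to illustrate the catch, here is a minimal sketch of the attempt using the PnP PowerShell module; the site URL, list and column names are made up and this is only one way of trying to set the index:

# Minimal sketch, assuming the PnP PowerShell module; site, list and column names are placeholders
Connect-PnPOnline -Url "https://thecompanysharepoint/sites/dw" -CurrentCredentials
# Try to mark a column on the large list as indexed ...
Set-PnPField -List "BigList" -Identity "Country" -Values @{ Indexed = $true }
# ... which fails with the same 'exceeds the list view threshold' error as the UI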

Hey SharePoint Expert,
Please leave a comment and tell me what I missed.🤢

2019-10-19

Unwanted intrusion

Recently I bought a tablet computer. After just a few days of use it started to beep every now and then. After a week this became a real pain, lots of 'pling and plong', and I had no clue what the alarms were about until I looked at the settings for notifications. I found about 30(!) different sources that send me notifications. I had probably given my consent by answering 'Yes' to these stupid prompts that many sites issue to let you in; I thought they were GDPR-mandated prompts for cookie consent or something similar. Now I have blocked them all, but I wonder how I get rid of these notifications completely. How many tons of coal are burnt every day just to send stupid notifications no one is interested in? I do not want your fucking notifications.

2019-10-15

Mainframe assembler

By mistake I stumbled upon some very old libraries of mine, with mainframe assembler code. Not just a few programs with a few lines of code, but tons of code, complete program products from days gone by. I had almost forgotten my days as a mainframe assembler developer, but while browsing the code it all came back. I was an autodidact, starting by reading Principles of Operation and some assembler manuals, followed by endless sleepless nights; it was fun. I started with a simple program reading the master operator console and responding to some prompts. After some years I created mini operating systems, a database manager, a super fast search engine, interacting with CICS, IMS, DB2 and RACF. For a few years I was mesmerized by the possibilities the assembler language gave you. Then I gave it all up for consulting with SAS Institute program products. When I started my consulting career, I had just a weekend of private studies of the entire SAS Institute package. The first weeks I just faked it, but that's another story.

If you really know assembler and have a good library of macros, you can be amazingly productive, since you have direct contact with the operating system and the computer. If you can also do your hexadecimal arithmetic well, operating system dumps are as easy to read as books. It would be fun to go back to my roots and start a second career as an assembler programmer.

Here is a 'repossessed' macro I used to initialize programs:


         MACRO                                                        
&NAME    LJENTRY &BREGS,&VERSION=,&WAREA=GETMAIN,&RSAVE=(14,12),       X
               &WAREAN=WAREA                                            00030074
.**********************************************************************
.* THIS MACRO SHOULD BE USED TO START ALL THE MODULES WHICH FORM      *
.* PART OF THE PROPER PROGRAM PRODUCT.                                *
.*                                                                    *
.* FUNCTIONS INCLUDE:-                                                *
.*                                                                    *
.*     1. SAVING CALLERS REGISTERS.                                   *
.*     2. CHAINING SAVE AREAS.                                        *
.*     3. BASE REGISTER ASSIGNMENT.                                   *
.*     4. SET UP OF COPYRITE NOTICE.                                  *
.*     5. CLEAR ASSIGNED WORKING STORAGE TO X'00'.                    *
.*        TO CLEAR THE WORK AREA THIS MACRO USES REGISTER 6,7,8 AND 9.*
.*                                                                    *
.* PARMS:                                                             *
.*     BREGS - NUMBERS OF BASE-REGISTERS                              *
.*    VERSION - RELEASE VERSION AND PTF LEVEL (X.X.X)                 *
.*      WAREA:                                                        *
.*          NONE    MODULE DO NOT HAVE A WORKING STORAGE SECTION      *
.*          (R)     A(WORKING STORAGE) PROVIDED BY CALLER IN R        *
.*          GETMAIN WORKING STORAGE IS OBTAINED BY PROGRAM.           *
.*          A-TYPE ADDRESS WORKING STORAGE RESIDES IN PROGRAM CSECT   *
.*                                                                    *
.*        IF WAREA IS (R) OR GETMAIN THIS MODULE REQUIRES THAT THE    *
.*        PROGRAM HAS A DSECT DEFINED AS FOLLOWS:                     *
.*                                                                    *
.*         NAME     DSECT                                             *
.*         SAVEAREA DS   18F'0'                                       *
.*         NAMEL    EQU  *-NAME                                       *
.*                                                                    *
.*          NAME  IS SPECIFIED IN WAREAN UNLESS WAREA IS AN           *
.*                A-TYPE ADDRESS                                      *
.*                                                                    *
.*        IF WAREA IS AN A-TYPE ADDRESS THIS MODULE REQUIRES THAT     *
.*        SPACE IS RESERVED FOR REGISTER SERVING PURPOSE AT THE       *
.*        BEGINING OF WAREA                                           *
.*                                                                    *
.*      RSAVE:                                                        *
.*          IGNORE  CALLERS REGISTERS WILL NOT BE SAVED               *
.*          (RX,RY) CALLERS REGISTERS WILL BE SAVE BY A STM OPERATION *
.*        IF CALLED MODULE REQUEST REGISTERS TO BE SAVED (AND CHAINED)*
.*        THIS MACRO REQUIRES THAT THE CALLER PROVIDES AN 18F AREA    *
.*        POINTED TO BY R13.                                          *
.*                                                                    *
.* THE FIRST 18 FULLWORDS OF THE WORK AREA CONTAIN THE SAVE AREA FOR  *
.* THIS MODULE. ANY WORKING STORAGE WHICH MUST BE UPDATED SHOULD BE   *
.* DEFINED WITHIN THIS WORK AREA AFTER THE SAVEAREA DEFINITION.       *
.*                                                                    *
.* CREATED APRIL    1986. BY L. JOHANSSON LAJ SOFTWARE PRODUCTION.    *
.**********************************************************************
         GBLC  &SREGS              REGS TO SAVE    GLOBAL FOR END MACRO
         GBLC  &WORK               AQUIRE WAREA    GLOBAL FOR END MACRO
         GBLC  &WAREAN1            WORK AREA NAME  GLOBAL FOR END MACRO
         LCLC  &BREG1              BASE REGISTER 1
&BREG1   SETC  '&BREGS(1)'
&SREGS   SETC  '&RSAVE'(2,5)
&WORK    SETC  '&WAREA'
&WAREAN1 SETC  '&WAREAN'
.*
         AIF   ('&SREGS' EQ 'GNORE').SAVE1    SAVE CALLERS REGS?
.SAVE    ANOP                                 YES WE SAVE CALLERS REGS
         STM   &SREGS.,12(R13)     SAVE CALLERS REGISTERS
         AGO   .BASE
.SAVE1   ANOP                                 NO WE DONT SAVE CALLERS R
         BC    0,4(0,15)           STEP LOC COUNT EQ WITH STM
.BASE    ANOP
         LR    &BREG1,15       SET UP 1:ST BASE.
         AIF   (N'&BREGS EQ 1).BASE1
         AIF   (N'&BREGS EQ 2).BASE2
         MNOTE 16,'LJENTRY - SPECIFY ONE OR TWO BASE REGISTER(S)'
         MEXIT
.BASE1   ANOP
         USING &SYSECT,&BREG1      ONE BASE REGISTER
         AGO   .CONST
.BASE2   ANOP
         LCLC  &BREG2              BASE REGISTER 2
&BREG2   SETC  '&BREGS(2)'
         USING &SYSECT,&BREG1,&BREG2 TWO BASE REGISTERS
.CONST   ANOP
         B     CONS&SYSNDX         BRANCH AROUND CONSTANT AREA.
         AIF   ('&WAREA' EQ 'NONE').CONST1
         AIF   ('&WAREA'(1,1) NE '(' AND '&WAREA' NE 'GETMAIN').CONST1
         DC    A(&WAREAN1.L)       LENGTH OF WORK AREA
         AGO   .CONST4
.CONST1  ANOP                      WORKAREA STARTS AT A LABEL IN MODULE
         DC    A(*-*)              NO WORKAREA LENGTH = 0
.CONST4  ANOP
         DC    AL1(CONS&SYSNDX-*)  LENGTH OF LABEL
         DC    CL8'MODULE -'       EYE CATCHER.
         DC    CL8'&SYSECT'        MODULE NAME.
         DC    CL8'VERSION-'       EYE CATCHER.
         DC    CL8'&VERSION'       VERSION.RELEASE.LEVEL .
         DC    CL8'&SYSDATE'       DATE COMPILED.
         COPYRITE
CONS&SYSNDX DS    0H
         AIF   (N'&BREGS LT 2).WAREA
         LA    &BREG2,2048(0,&BREG1)  LOAD UP ...
         LA    &BREG2,2048(0,&BREG2)    2ND BASE REGISTER.
.WAREA   ANOP
         AIF   ('&WAREA' EQ 'NONE').SLUT
         AIF   ('&WAREA' NE 'GETMAIN').WAREA1
         GETMAIN R,LV=&WAREAN1.L   GET WORKING STORAGE FOR MODULE.
         LR    R6,R1               CLEAR ...
         LA    R7,&WAREAN1.L          GOTTEN ...
         SR    R9,R9                    WORK AREA ...
         SR    R8,R8                      TO ...
         MVCL  R6,R8                         ZEROES.
         AGO   .CHAIN
.WAREA1  ANOP
         AIF   ('&WAREA' EQ '(1)').CHAIN
         AIF   ('&WAREA'(1,1) EQ '(').WAREAR
         LA    R1,&WAREA           LOAD A(WORKAREA)
         AGO   .CHAIN
.WAREAR  ANOP
         LR    R1,&WAREA           LOAD A(WORKAREA)
.CHAIN   ANOP
         ST    13,4(0,1)           CHAIN
         ST    1,8(0,13)             SAVE AREAS.
         LR    13,1                NEW S-A ADDRESS.
         USING &WAREAN1.,R13      MAKE WAREA ADDRESSABLE.
         L     R1,4(R13)           R1 ==> CALLERS SAVE AREA.
         L     R0,20(R1)           R0 = R0 ON ENTRY.
         L     R1,24(R1)           R1 = R1 ON ENTRY.
.SLUT    MEND 

2019-09-08

Pseudo Delta Loading of SAP RESB

In 2012 I wrote the first post on how to extract material reservations from SAP, contained in the table RESB; at least I could not find anything useful on the subject then. I still consider my post the definitive guide on how to offload active material reservations from SAP, or if not the definitive guide, a very good one not involving any ABAP code. To my sorrow I found that the code examples are blank pictures in my old post.

Due to a recent unlucky incident my workflow was taken out of production and I had to reinstate the RESB offload workflow. While doing this I realized it might be of interest for some SAPers to read and use ideas from the old post. If anyone is interested I will try to fill in the missing parts of my original post.

Some years have passed since I wrote my post, and you might know of a better way to 'delta' load from RESB; if so, please drop me a mail or a comment here.

2019-08-31

RAID-6 in peril


I have been away from the office most of this week. Yesterday I was back and decided to do a visual inspection of the Data Warehouse servers, and to my horror I saw this on the very important database server:








Two failed disks in a RAID-6 array is not something you like to see, but thank God I chose RAID-6 over RAID-5 when I configured the server in 2012. Yesterday was the last Summer Friday this year and all the persons responsible for the server were out of office. After some asking around I got the number to DELL support. After an insane number of robot questions, the robot handed me over to the human switchboard lady who, to my surprise, understood my problem and connected me to human support, Raymond, who efficiently took me through a small set of reasonable questions and then told me 'alright, I send you two disks, they should arrive in four hours' (they arrived within two hours). Raymond was not only helpful and efficient, he also understood my last name JOHANSSON and could spell it right, something impossible for most native English speakers. 'JO.. what? Can you spell it please?' is the normal response when I say my last name, and I'm terrible at spelling in English. I avoid spelling by saying my last name is Johansen, which is the Norwegian variant of my last name, putting a heavy emphasis on the second syllable, JoHANsen, and adding "it's double S". That works both in Britain and the US. But Raymond got it right the first time, he just asked 'is it double S?'.
One of the guys responsible for the server came in to the office, fetched the disks at the reception, and together we hot swapped one of the failed disks. Now, some 24 hours later, it is 60% recreated. I really hope it will be 100% recreated before another disk pops. The disks are from 2012; even the failing disks have done a very good job in continuous heavy duty since then. It's the monitoring that is failing, and that's partly my fault. Wish me good luck, I really need it.

Update 2019-08-31 16:20 CET 91% recreated still counting and looking good☺
Update 2019-08-31 17:20 CET 100% recreated☺

Now we should replace the other failing disk, but my younger son took my car earlier this afternoon and I am not going to the office on the weekend by public transport to replace a disk, so that has to wait until tomorrow or Monday.






2019-08-28

Shoot yourself in the foot

All programmers have shot themselves in the foot, and I am no exception. Over the years I have shot myself in the foot so many times I have no toes left, figuratively speaking that is. I accidentally almost cut off my right big toe forty-five years ago, but that is another story. Programmatically shooting yourself in the foot is about destroying a good program by accidentally inserting a few lines of self-destructing code.

"Hey Lars, can you help us? We need to transfer a csv file from Qlikview into the Data Warehouse in real time."
"Why do you ever want to do that? Its like putting the cart before the horse. Actually it's even worse." I said.
"Yes we know, but now we have this setup and it would be of great help if we could send the csv file from Qlikview to the Data Warehouse."

I started to play with some code in the PowerShell ISE environment. It is a very nice environment to play with, develop and debug code. The environment is not cleared between code executions, which is a mixed blessing: normally this is a neat feature, all your variables are kept so you can just insert or add code and proceed to run your program. This time however it just created problems, as I commented out parts of the code between iterations, so I inserted this line at the top of my code to clear the ISE environment:

Remove-Variable * -ErrorAction SilentlyContinue; Remove-Module *; $error.Clear(); Clear-Host 

This worked nicely and in no time I developed a script that first FTPs the csv file to the Data Warehouse server and then, via a call to the Data Warehouse web server, invokes a DW workflow inserting the csv file into the Data Warehouse. I then wrapped the PowerShell code into a function 'sendFile2DW' (see below), supplying the csv file, the Data Warehouse & the workflow as parameters; now I could insert a file into the Data Warehouse by:

sendFile2DW "File.csv" "data warehouse" "workflow.xml"



Very nice, except nothing happened, just some error messages. Whatever I did the parameters were blank. Being inexperienced with PowerShell functions I wrongly assumed there was something wrong with my function setup. I tried God knows how many different parameter definitions; after hours of futile attempts I scrutinized the function code and then I found:

Remove-Variable * -ErrorAction SilentlyContinue; Remove-Module *; $error.Clear(); Clear-Host 

a bit down in the code. Removing that line, the function worked as expected. This is a perfect example of shooting yourself in the foot.
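
To show the effect in isolation (a minimal sketch, not part of the original script): inside a function, Remove-Variable * also wipes the function's own parameter variables.

function Show-FootGun {
    Param ([string]$Name)
    Write-Host "Before: Name = '$Name'"
    # the 'clear everything' line from the ISE session, now executing inside the function scope
    Remove-Variable * -ErrorAction SilentlyContinue
    Write-Host "After:  Name = '$Name'"    # the parameter is gone, so this prints an empty value
}
Show-FootGun "File.csv"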



function sendFile2DW {
     Param ([Parameter(Position=0)] [string]$inputfile, [Parameter(Position=1)] [string]$dwuser, [Parameter(Position=2)] [string]$dwschedule) 
      
$x = @"
This script FTPs a file to the Data Warehouse and then executes a schedule in the Data Warehouse.

Parms:
1 - file to send 
2 - the Data Warehouse
3 - Schedule to run in the Data Warehouse (workflow)

Processing:
1 - First we do necessary configurations by initialize some variables.
2 - Then we FTP
3 - Finally we submit a schedule for execution in the Data Warehouse and wait for execution to finish

Note! The Data Warehouse web server must be up and running before submitting the schedule! 

Set-ExecutionPolicy -Scope CurrentUser
"@ 
    $myParms = "my parms - $inputfile, $dwuser, $dwschedule"
    write-host "$myParms"
    
    # Start of configuration
    # Config for the SMTP server and mail sender & recipients for error messages
    $SMTPServer = "X1"
    $mailFrom = 'Qlik2DW <X2>'
    $mailTo = 'X3, '
    #$SMTPPort = "587"

    # Config for the Data Warehouse FTP site
    $FTPserver = 'X4';
    $FTPmap    = 'X5';
    $FTPuser   = "X6" 
    $FTPpw     = "X7"

    # Config for the Data Warehouse web server schedule kicker
    $dwwebserver = "X8"

    # End of configuration. Do not touch anything below! 

    #Before we start, where are we?
    $cwd = Get-Location 
    write-host "$MyInvocation.ScriptName runs in $cwd"
    Remove-Variable * -ErrorAction SilentlyContinue; Remove-Module *; $error.Clear(); Clear-Host # wiping out vars :(
    write-host "FTP upload file=$inputfile , FTP = $FTPserver, $FTPmap, $FTPuser"

    $FileName = [System.IO.Path]::GetFileName("$inputfile");
    $FTPfileName = $FileName
    $FTPfile = $FTPserver + "/" + $FTPmap + $FTPfileName 
    write-host "Start uploading file $inputfile to FTP as $FTPfile"
    #write-host "ftp://${FTPuser}:${FTPpw}@$FTPserver/$FTPfileName"

    $client = New-Object System.Net.WebClient
    $client.Credentials = New-Object System.Net.NetworkCredential("$FTPuser", "$FTPpw") 
    $uri = New-Object System.Uri("ftp://$FTPfile") 
    
    try {
        $client.UploadFile($uri, $inputfile)
        $client.Dispose()   # release the WebClient
    } catch {
        $_.Exception.Message
        $emsg = "The FTP upload failed aborting..., please check!"
        Write-Host "$emsg"
        Send-MailMessage -From $mailFrom  -To $mailTo -SmtpServer $SMTPServer -Priority High -DeliveryNotificationOption OnFailure  -Subject 'Error' `
           -Body "$_.Exception.Message `n $emsg `n $myParms"   
        Exit
    } 
    write-host "FTP upload of $inputfile done" 


    write-host "Submitting schedule $dwschedule to $dwwebserver for execution in the Data Warehouse" 

    try {
        $wr = Invoke-WebRequest -Headers @{"Cache-Control"="no-cache"} http://$dwwebserver/dw/$dwuser/$dwschedule -method get
    } catch {
        $_.Exception.Message
        $emsg = "The DW Web server may be down, please check!"
        Write-Host "$emsg"
        Send-MailMessage -From $mailFrom  -To $mailTo -SmtpServer $SMTPServer -Priority High -DeliveryNotificationOption OnFailure  -Subject 'Error' `
           -Body "$_.Exception.Message `n $emsg `n $myParms"   
        Exit
    }

    $wr.InputFields | Where-Object {
        $_.name -like "*"
    } | Select-Object Name, Value

    $dwpid = $wr.Content
    $dwstsdesc = $wr.StatusDescription
    $dwstscd = $wr.StatusCode
    $wr.close

    Write-Host "Executing the schedule $dwschedule in the Data Warehouse job PID=$dwpid Status=$dwstsdesc ($dwstscd)"

    If ($dwstscd -ne 200) {
        $emsg = "Oops, An unexpected error occurred! (Status=$dwstscd)"
        Write-Host "$emsg"
        Send-MailMessage -From $mailFrom  -To $mailTo -SmtpServer $SMTPServer -Priority High -DeliveryNotificationOption OnFailure  -Subject 'Error' `
           -Body "$emsg `n $myParms" 

         Exit 
    }

    Write-Host "Waiting for schedule $dwschedule pid=$dwpid to finish, content=$dwpid, desc=$dwstsdesc, statuscd=$dwstscd"

    do {
        Start-Sleep -s 1
        $wrp = Invoke-WebRequest http://$dwwebserver/pid/$dwpid -method get
    #    Write-Host $wrp.Content
        $wrp.InputFields | Where-Object {
            $_.name -like "*"
        } | Select-Object Name, Value
    } while ( $wrp.Content -eq 0)
    $wrp.close

    Write-Host "Done, the schedule $dwschedule is executed in the Data Warehouse"
    #Pause
}




2019-08-21

Rename Perl6 to Perly

As a complete outside spectator of the Perl6 naming disaster, I have proposed a rename to PerlyGate to keep connotations to both old Perl and all the religious mumbo jumbo Perl developers are so fond of. But my suggestion seemed to go unnoticed, which might be due to the fact that this blog is not read by anyone.
But I'll give it another try. Why not PerlySix, which keeps the reference to Perl6 and at the same time is a new name separate from Perl? Or just Perly, a new name keeping the reference to Perl. Both Perly and PerlySix avoid the version number confusion with Perl.

Do the rename of Perl6, and bump Perl's version to Perl 7, skipping over the unfortunate Perl 6.
Perly can keep the version number 6 or start from 1; both options sound good to me.

Update: As I understand it, Perl6 is now renamed to Raku. To me it sounds very 'pale', no flair whatsoever, but I'm not a native English speaker, so what do I know. Anyway, it is the language that counts, not the name.

2019-07-11

Data Scientist

For some years now I have seen the new profession "Data Scientist" appear more and more frequently in articles, papers and job advertisements. It seems like every employer wants to hire Data Scientists. Lately I have begun to think "how cool would it not be to be a Data Scientist"; just the word Scientist has a certain aura around it. Yes, I want to become a Data Scientist, so I Googled "what is a data scientist". A lot of hits, and they all replied about the same thing: an IT professional who can help the business analyse the business (this is one of the best of those I found, go directly to the video clip by Kirk Borne, PhD, Principal Data Scientist at Booz Allen Hamilton. I'm a bit bored by the "this is not an IT function, this is a business function" talk; the weight is definitely on the IT side.).

The result was sort of an anticlimax; Data Scientist seems to be what I have done part time more or less my entire career. But I have given myself more humble titles: report writer, programmer, data analyst, BI developer. It is basically the same as the more gaudy "Data Scientist". Of course the methods have changed over time; in the old COBOL and paper days the options were more limited, but the core "help the business find and analyse the data" is very much the same. I have now upgraded myself with the flamboyant title Data Scientist.

I can hear some people say "but you have never understood the business". I have created an MRP system and a BI system; how can you do that without understanding the business? When I left the company the first time the IT manager told me "you are the only one in the IT department who fully understands the material flow in the company". That was not really the case: I understood the material flow well in the business area producing mining equipment, while for the other business areas my knowledge was more superficial. Nevertheless I think I qualify for the Data Scientist title😎

 

2019-06-23

Email automation, C# and PowerShell

Some time ago I did an Email automation application in my Data Warehouse. In April this year I got a request from "the business": 'Can you help me develop a macro that creates PDFs?' 'Uh?' I replied. The colleague had a very large Excel workbook with a hundred-plus sheets, each of which should become a separate PDF file. I'm really bad at Excel, I have problems with auto sums, I have never written a macro and I have only done some Visual Basic scripts. But I promised to look at it, and then I forgot all about it. In May my colleague asked if I had looked at the Excel; I promised to look at it, which I did. I wondered why my colleague wanted PDFs, is there not more to it? And then I forgot about it again. In June my colleague asked again if I had had time to look at her Excel. Now I felt I had to do something, so I asked if the PDFs were all she wanted, and could she explain what the source for this Excel was and what she was going to do with the PDFs. I got a long story about compiling material for invoices, creating this multi-sheet Excel, creating PDFs and sending them to up to six recipients, of whom she only had the names, not email addresses. As always when an old manual process inherited from person to person is explained to you, it sounds complex. I did not understand much about it, but I understood she had a lot of mails to send to a lot of names/recipients.

This was more than I could do with an Excel macro.
I had a large multi-sheet Excel where each sheet should be sent to a company, and the recipients in another Excel sheet with six columns and one row per receiving company.

First I created a PowerShell script splitting the multi-sheet Excel into separate Excel files. I asked if I could attach an Excel file to each mail instead of cutting and pasting a PDF into the mail body as they did. This was one of the odd tasks in this manual process; it was just done that way because it was the way they had always done it. I was told Excel attachments were actually much better, since these invoices were not legal invoices, just notifications to subsidiaries within the Enterprise.
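
The split itself can be done with Excel COM automation from PowerShell. This is a minimal sketch of the approach, not the production script; the paths are made up and Excel must be installed on the machine:

$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$excel.DisplayAlerts = $false
$workbook = $excel.Workbooks.Open("C:\temp\invoices.xlsx")
foreach ($sheet in $workbook.Worksheets) {
    $sheet.Copy()                                                        # Copy() without arguments creates a new workbook holding this sheet
    $excel.ActiveWorkbook.SaveAs("C:\temp\out\$($sheet.Name).xlsx", 51)  # 51 = xlOpenXMLWorkbook (.xlsx)
    $excel.ActiveWorkbook.Close($false)
}
$workbook.Close($false)
$excel.Quit()
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel)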

Next I took care of the translation of recipient names into email addresses. I created a C# program listing the Global Address List; with it I should be able to translate any Enterprise employee name into an email address. Of course I was wrong, I missed about 30% of the recipients, since it turned out not all subsidiaries were incorporated into the Enterprise mail system. I then listed all users in the Active Directory, this time with PS scripting; after the C# adventure I decided to go for PS scripts, which are much faster to develop and fast enough for this kind of automation. Unfortunately I still missed about 10% of the recipients. For those I created a list manually and also accepted email addresses directly in the recipients list. I had to extract data from two 'got-it-all' sources, and still had to add a manual source and allow for name and email address specifications of recipients. If I had had one source for all recipients I would have saved quite some time; the C# program extracting the GAL alone took me the better part of a Sunday.

I was given a static mail body to be included in the mails. So the last thing I did was a PS script for the mail automation. PS scripting is actually very nice for Email automation; just for the hell of it I added variable substitution for the mail template so I could personalize the mails.
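
In essence the mail automation is a loop over the recipients with Send-MailMessage and a simple substitution on the body template. A minimal sketch; the server, sender, file names and the recipient file layout are assumptions:

$template   = Get-Content "C:\temp\mailbody.txt" -Raw            # the static mail body, containing e.g. the placeholder {company}
$recipients = Import-Csv "C:\temp\recipients.csv"                # columns: Company, Mail, Attachment

foreach ($r in $recipients) {
    $body = $template -replace '\{company\}', $r.Company         # personalize the template
    Send-MailMessage -SmtpServer "smtp.example.com" -From "dw@example.com" -To $r.Mail `
        -Subject "Invoice notification - $($r.Company)" -Body $body -Attachments $r.Attachment
}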

Doing this Email automation exercise led me to some conclusions:
PowerShell scripting is very good for automation of manual tasks.
Good HR master data is essential for automation of tasks where people/employees are involved.
Automation is fun. Having done the output part of a manual process, I'm looking forward to taking a stab at the input part.

In the next post I will show some code :)

2019-06-12

Recruitment process out of control

The recruitment process has gone haywire, or rather been kidnapped by HR. These days when you apply for a new job you do not just send in your CV and then eventually get selected for interviews by HR and the recruiting manager, and maybe a subject matter test of some kind. No, these days you first have to pass an HR test over the Internet, most likely created by a hired recruitment company and often evaluated by a so-called robot. These tests are supposed to measure your IQ and ability to solve complicated problems. I have come across a few such tests and they look more like bullshit than anything else in my eyes. You have very little time for each question, e.g. the instructions take longer to read (for a dyslectic) than to solve/answer the question. That is a test of speed reading, not of what I think the question is intended to measure. And what does the ability to fill in the missing picture in a Sudoku matrix have to do with the ability to solve complex problems? One day of training on these tests improves your score a lot. Did that day increase your IQ and your ability to solve complex problems? I doubt it.
I am confident this new sifting test process is more harmful than helpful.
I recently heard this true story about a very distinguished science professor who had founded a research company and let a recruitment company manage the recruitment process. To his disappointment his best PhD student did not apply, so he called him up and asked 'why didn't you apply?'. 'I didn't pass the test' was the reply. The professor called up the recruitment company, told them 'you have turned down a world leading expert' and requested to see the tests and all the results. He then took over the recruitment process and hired the PhD student who had failed the test.

2019-06-02

In memoriam

In some posts I have written about colleagues who have left for new challenges outside the company. This post is different: last week Carina passed away after a short period of severe illness. We did not work together, but we were colleagues; when I was an IT manager she was employed as a secretary. Later, when I moved to the HQ, she worked as an IT admin and in the last years as a communicator for corporate IT in the same office where I work. I, who used to come in first in the morning and leave last in the evening, was second when Carina joined us. She was dedicated and hard working like few, but that is not what she is remembered for; it is her positive attitude, always cheerful and friendly. We always had a small chat before the others came in. When she occasionally left before me, I used to say "Working part time now, are we?" and we laughed a bit about that. There is no more "Good morning Lars!" when I arrive at the office. It is funny, I did not realize how much that actually meant to me until she became ill just a few months ago. There is a big empty space in the office. Farewell Carina, my thoughts are with you and your family.

2019-05-15

OData: Request failed (404)

I'm trying to get data from SharePoint into Power BI via an OData query using the OData connector. My query is perfectly legal in a browser (or PowerShell), but in Power BI the query results in a 404 return code. Googling the error, the best reply I get is:
the above error could occur when OData URL string is large or large number of columns are pulled into Power BI Desktop. Please check your URL and try to reduce columns, then check if the error goes away. 
I cannot find any limits, or actually anything, that tells me what the f-ng error is. I reduced my request to a minimum and I still got the same 404 response.
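
For reference, this is roughly how I can verify that the same data is reachable outside Power BI with PowerShell; the list name, the columns and the use of Windows authentication are assumptions:

$uri = "https://thecompanysharepoint/metadata/application/_api/web/lists/getbytitle('ISOCurrencyCode')/items?" +
       "`$select=CurrencyCode,CurrencyName&`$top=10"
Invoke-RestMethod -Uri $uri -UseDefaultCredentials -Headers @{ Accept = "application/json;odata=verbose" }
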
It seems the only OData request I successfully process is a simple get of one list. So I download the needed lists one by one and then, by clicking around, manage to join the lists in Power BI and find this code in the advanced editor:

let
   Source = SharePoint.Tables("https://thecompanysharepoint/metadata/application", [ApiVersion = 15]),
   #"cffb758e-476c-4673-84f5-7be931f86eb2" = Source{[Id="cffb758e-476c-4673-84f5-7be931f86eb2"]}[Items],
   #"Renamed Columns" = Table.RenameColumns(#"cffb758e-476c-4673-84f5-7be931f86eb2",{{"ID", "ID.1"}}),
   #"Merged Queries" = Table.NestedJoin(#"Renamed Columns",{"Currency CodeId"},ISOCurrencyCode,{"Id"},"ISOCurrencyCode"
,JoinKind.LeftOuter),
   #"Expanded ISOCurrencyCode" = Table.ExpandTableColumn(#"Merged Queries", "ISOCurrencyCode", {"ContentTypeID"
, "CurrencyCode", "CurrencyName", "CurrencyNumCode", "CurrencyISOEdition", "CurrencyStatusValue", "CurrencyComment"
, "CurrencyAC_Reporting", "Id", "ContentType", "Modified", "Created", "CreatedById", "ModifiedById", "Owshiddenversion", "Version"
, "Path", "CurrencyStatus", "CreatedBy", "ModifiedBy", "Attachments"}, {"ISOCurrencyCode.ContentTypeID", "ISOCurrencyCode.CurrencyCode",
"ISOCurrencyCode.CurrencyName", "ISOCurrencyCode.CurrencyNumCode", "ISOCurrencyCode.CurrencyISOEdition"
, "ISOCurrencyCode.CurrencyStatusValue", "ISOCurrencyCode.CurrencyComment",
"ISOCurrencyCode.CurrencyAC_Reporting", "ISOCurrencyCode.Id", "ISOCurrencyCode.ContentType", "ISOCurrencyCode.Modified"
, "ISOCurrencyCode.Created", "ISOCurrencyCode.CreatedById", "ISOCurrencyCode.ModifiedById", "ISOCurrencyCode.Owshiddenversion"
, "ISOCurrencyCode.Version", "ISOCurrencyCode.Path", "ISOCurrencyCode.CurrencyStatus", "ISOCurrencyCode.CreatedBy"
, "ISOCurrencyCode.ModifiedBy",
"ISOCurrencyCode.Attachments"}),
   #"Filtered Rows" = Table.SelectRows(#"Expanded ISOCurrencyCode", each ([Month] = 2) and ([Year] = "2016"))
in
   #"Filtered Rows"

I modified this code with great care to get a result I'm happy with, since I cannot find any reference or guidance at all for this scripting language, whose name I do not even know.
Anyway, now it works, and I got the requested result.