Happy new year

It seems there is a new year every year, and it’s customary to sum up the year that has been and make predictions for the year to come.

Last year I had great expectations for 2014, but as always things seldom work out as expected. Professionally, 2014 was not as productive as I had expected.
I hoped to work with Business Intelligence on a corporate level; that didn’t happen. But a sourcing app in the Data Warehouse has now been adopted by all Business Areas. (The year before, I completely failed to show the logistics people in the group the similarities between Business Intelligence and Master Data Management; the idea was to demonstrate how they could pick low-hanging fruit with reports they already had.)
I hoped to launch a Group-common Master Data Management project; that didn’t happen, the business areas were not ready to start. But I have great hopes for this year: we have a kick-off meeting scheduled. Dedecoretur qui succumbit (dishonour to him who gives up).
For various reasons I did not do much IT architecture work; other things came in between.
On a more personal level I hoped to blog less and learn JavaScript. That did not happen: I blogged a bit more than the year before, and only did tiny bits of JavaScript; I probably know JavaScript less well today.
A highly appreciated colleague decided to leave the company. This is natural, colleagues come and go (I have done the same), but sometimes it hurts more.
I probably did some good things too, but I focus on the negative; it’s the personality, you know: the glass is half empty.

But I’m not too depressed, 2015 will be better.
The common Master Data Management project will start.
Hopefully I will work with enterprise data integration. (I was involved in the creation of the company’s first data integration system, the Funnel; this system will be buried in the graveyard of unsung heroes in 2015 after some 35 years of service.) In 2015 we will introduce the publish-and-subscribe integration model.
Right now I’m testing a new model for integrating vendors of components for our products, though I’m far from convinced it will fly.

I will do more JavaScript coding.
The Perl6 project will ship its first production release; I read it on the Internet so it must be true!!
I will try to read up on information warfare systems engineering. It is of vital interest for the democratic world to keep the upper hand in information warfare, and I want to know what happens in this area.

I hope 2015 will be better for me than 2014. On a global scale I fear things will grow even worse in 2015. Writing about personal ambitions for the next year is perhaps petty. E.g. we are frying the world and fooling ourselves that we can do something about it. If we were serious about this we would drastically decrease the production of fossil fuels; that will only happen after some billion people have died in natural catastrophes as a consequence of global warming.
I have an ambition to start a blog in Swedish, a language I master, writing about politics. I do not for a second think it will have an impact, but it might make me feel better.

Despite this post I sincerely wish you all - A happy new year!


Deep machine learning and intelligence

Stockholm, Gärdet 2014-12-29

Andreas sent me this link, where Jeremy Howard talks about machine learning and an emerging IT revolution where computers replace humans for somewhat more advanced intellectual work than today’s ERP, CAD etc. systems are capable of. A very interesting TED talk. If I understand Mr Howard right, he implies the technological singularity is imminent with the help of an algorithm called deep learning. Earlier this year I found a prediction on Wikipedia that this will happen around 2040; I doubt it. According to Mr Howard the necessary building blocks for superhuman intelligence are already here; we just have to put the pieces together better, and this will go fast. What I am absolutely certain about is that the pace of computer ability is accelerating exponentially and every new prediction tends to be closer in time, but not for superhuman intelligence.
Predictions for computer ability in general resemble the predictions for the mapping of the human genome: it was thought to take an excruciatingly long time, until Craig Venter came along. I think predictions for the singularity will more resemble the predictions for the commercial breakthrough of the fuel cell: it is always twenty years in the future.

This doesn't mean computers are not getting smarter. Computers, or rather computer programmes, are transcending from aiding decision making to making decisions. This has been going on for some time now, and the pace is accelerating; we are at the dawn of a new computer revolution where intellectual workers will be replaced by computers. And this is pretty important: in five to ten years we will have a new world, with new opportunities, not all of them good.
Some weeks ago I participated in a seminar at KTH discussing digitalisation of business opportunities. The participants were top IT managers from some of Sweden’s biggest corporations, plus me. One participant told us that in the near future no one will let humans take care of their financial investments: ‘Do you let a human do your investments? Are you stupid?’ We all agreed that this new IT revolution has already begun and that there are great (business) opportunities for us all.

But for computers to become smarter than us I think we have a long way to go. Mr Howard is most probably right: machine learning is important for computer development. He says ‘the machine learning revolution never settles down’; how does he know? There are capping limits to all things under the sun (and beyond), ultimately physical limits like the laws of thermodynamics. Quantum mechanics most likely has limitations too. In my somewhat pessimistic post I stated that man has the ultimate intellect: we can figure out anything logical, just given time. But it might so happen that speed is what smartness is all about; if so, computer programmes will prevail over man. Shortly after, man will most likely be annihilated; fortunately for me this will only happen twenty years from now.


Why do you use PHP?

My nephew asked me ‘why do you use PHP?’ Only one of my nephews could ask such a question; the other one is playing ice hockey in the US and has no more interest in programming than the rest of the clan. On Christmas Eve we all met for Christmas dinner (the other nephew participated via Skype). Why PHP?

In 2000 I was back working for Atlas Copco. Up until then I had only used SAS Institute software in Windows and Unix environments (I was a mainframe guy at heart, still am if the truth be told). I had done some Bourne shell scripting, some Windows bat files and a bit of C and Java, but nothing serious. At the time I was about to create a Business Intelligence system; I had lots of business people crying for better reports, but no one in top management was willing to pay for a top-notch SAS Institute BI system. So I decided to go the other way: instead of spending big bucks I went for a minimal budget, just for fun if nothing else.
I started with a scrapped PC where I installed Mandrake Linux and MySQL. Since I had written the mainframe ERP system myself, I had no problem extracting the marketing and production figures the business people wanted to analyse. But how the ---- should I get the data into the MySQL database? I needed a language with a MySQL interface where I could express some rudimentary logic, like signalling if data loading into MySQL failed.
I needed a simple language with a low step-in threshold; I didn’t have the time to learn enough C or Linux shell scripting. When I looked around I came up with two candidates: Perl and PHP. I had been playing with Perl, but PHP was new to me. Perl was the obvious choice: I knew the language, it was mature and well suited for the task, while PHP 4 beta was a primitive language for (simple) web server scripting, a very awkward choice for what I had in mind. But PHP was new and unknown to me, and I just could not resist trying it out; I have a taste for betas, trying out new functionality is fun. When the other kids on the block come screaming ‘look what I have done!’ I enjoy saying ‘I did it years ago’; it boosts my ego. And I like to go my own way. The BI system I created was against every IT policy the company had; I could not do it according to the company rules, so it was either this new system or the old paper reports the users did not want.

After some years my initial very simple PHP script had evolved into a new language, ITL. So here I am in 2015 with a language of my own for a BI system based on PHP running on Linux. PHP has evolved, and today it is a mature and in many ways good language. It is hard to dynamically execute code, parallel execution is a bit odd and I do not know if asynchronous I/O can be done, but otherwise pretty much anything can be done with a simple high-level syntax.
I would not do systems programming in PHP; for that I would use C, assembler or the D language, which I find very interesting.

Would I use PHP if I started all over again with my BI system? No, there are new fresh languages out there waiting to be explored. Recently I have used Node.js, but Python or Perl6 is probably the language I would go for.
If I were to do something in the area of Enterprise Integration, an area I have taken an interest in lately, I would start with Erlang OTP, if only I could get my head around Erlang; the language seems to be made for guys smarter than me.

I have never regretted my choice of PHP. The language has evolved nicely and has an interesting future with new projects like Hack, and the next version of PHP itself seems to be almost revolutionary in terms of performance. With each version PHP looks more and more like the scripting language next door: a bit boring maybe, but good for the programming community and the world.


Merry Christmas from PhpMetrics

If this were a non-figurative Christmas decoration made by PhpMetrics, I would be really impressed. But it is the evaluation of my code work for the Data Warehouse :(
Thanks PhpMetrics for the appreciation of my hard work.
PhpMetrics is a tool that analyses PHP code and visualises the result: the redder and bigger the circles, the worse the code.
PhpMetrics is great: extremely simple to install and use, and what a Christmas decoration! The colors of Christmas, red and green, plus some yellow. This could be a Christmas trivet made by Alvar Aalto.

I will try to understand why my code rates so poorly. Browsing the documentation I saw that eval and goto rate badly. Eval, this evil instruction: I had no other option in PHP, I was forced to use it. As for goto I’m of another opinion; sometimes goto is just the right instruction. I saw PhpMetrics also analysed some other software products I use; it took the entire directory structure, and I suspect the combination may explain some red spots. I do not use much OO code, and there are some horrid test programs still in there. And I use one GLOBAL in almost every function (accessing the log routine). And the bulk of the code was written between 2004 and 2007. Lots of excuses; maybe it is as simple as ‘the code is crap’, or PhpMetrics is good for creating nice-looking images with little meaning.

Anyway, the picture is great; I simply love it.
Try to beat my code; create a more Christmassy graph with PhpMetrics if you can.

From me to all (well almost) of you - A Very Merry Christmas.


SAP SNC integration, enter PowerShell

For me, shell programming is mostly about automating simple IT admin tasks like reorganising database tables, compacting file systems etc. I’m not very good at doing the same thing twice, so I tend to write ‘admin procedures’ for many small tasks. Over the years I have used a lot of shell scripting languages. The first I used were IBM mainframe EXEC/CLIST (Command List) and JCL (Job Control Language). JCL is not normally regarded as a shell scripting language; anyway, that was many years ago and both these languages were pretty primitive/crappy. In all fairness, IBM has also constructed one great (shell) scripting language: REXX.

When you write a shell script you often need a good understanding and control of the environment and context where your script runs. Good interfaces to the underlying opsys and commands are essential, as are predefined constants for script name, opsys version, current working directory etc. My ‘shell processes’ often consist of an initial part maintaining configuration parameters and another part acting upon the ‘computer environment’ with the help of these parameters.
Now I’m creating an interface between SAP SNC and external ERP systems, a task where a shell scripting procedure may seem an odd choice. But I got some odd requirements: the interface must work on a standard Windows desktop and be very easy to distribute and handle, since the users of this interface are not expected to be IT-savvy. Anyway, I thought why not give PowerShell a go and contain the entire interface in one script, for easy export/import. If you read the link above you know SAP SNC spits out a rather ugly CSV file and the receivers expect a decently structured XML file. I decided to do the file metamorphosis from CSV to XML in two steps: first transform the CSV file into a SAP SNC XML format, and then transform that XML file with an XSL script into the final XML format. It was surprisingly easy to create a draft. It only took some hours, thanks to some powerful functions in PowerShell, e.g. a one-liner to parse the complex SNC CSV file:
$olArray = Get-Content 'CSV-FILE-NAME' | Select-Object -Skip 11 | ConvertFrom-Csv
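For readers without PowerShell at hand, roughly the same step can be sketched in Python. Only the 11 skipped header lines come from the one-liner above; the sample data and column names are made up for illustration:

```python
import csv
import io

# A stand-in for the SNC export: 11 header lines, then the actual CSV data.
# The real SNC columns differ; Material/Quantity are hypothetical.
sample = "\n".join("header line %d" % i for i in range(11)) + "\n" + \
    "Material,Quantity\nM-100,25\nM-200,7\n"

with io.StringIO(sample) as f:      # open('CSV-FILE-NAME') for a real file
    for _ in range(11):             # skip the 11 header lines, like Select-Object -Skip 11
        next(f)
    rows = list(csv.DictReader(f))  # ConvertFrom-Csv equivalent: dicts keyed by the header row

print(rows[0]["Material"], rows[0]["Quantity"])  # → M-100 25
```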

Another very pleasant surprise was the abundance of useful information I found on the web; I could pinch a lot of code directly from blog posts. I didn't expect that; in the past I found free (as in free beer) Microsoft information rare.
It is simple to create popups for text input and file selection, which is of great value for configuration. 

But there were some nasties too. The worst is actually really bad: functions reply with all output produced, e.g.:
return 'OK'
does not necessarily return only 'OK'; all other output created in the function is also pushed onto the ‘return stack’. This forces you to do some extraordinary cleansing after the call:
$returnValues = myFunction "$Parameter"
$OKstring = $returnValues[-1]  # pop the last value off the 'return stack'
This is exacerbated by the fact that any value not captured by a variable or redirected is automatically written to the output stream, making it appear on the ‘return stack’; this makes it hard to find everything pushed onto the return stack. I find this behavior very strange, and it took me some time to figure it out. I hope I’m wrong about this, or that there is a clear ‘return stack’ option like: return -only 'OK'
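The cleanup idiom can be illustrated with a loose Python analogy. This is only a sketch of the behavior as I understand it, not real PowerShell semantics, and `noisy_function` is a made-up name:

```python
# Loose analogy (an assumption, not PowerShell itself): the function effectively
# returns a list of everything it emitted, with the `return` value last,
# and the caller keeps only that last element.
def noisy_function(parameter):
    emitted = []
    emitted.append("some stray output")  # like an uncaptured expression leaking to output
    emitted.append("more stray output")
    emitted.append("OK")                 # like `return 'OK'`
    return emitted

return_values = noisy_function("x")
ok_string = return_values[-1]  # pop the last value off the 'return stack'
print(ok_string)  # → OK
```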

The PowerShell ISE hangs every now and then when I use popups, which seem to be placed haphazardly on the screen depending on Windows version.
All in all I find PowerShell scripting a pleasant experience: a very powerful language, although a bit quirky. I have only been coding for about 15 hours with no training up front, so I might have got things wrong.

My script now takes this:
to this:

I’m happy with my script. It is an easily configurable script able to run on most Windows systems, or so I hope. I wrote the script with PowerShell versions 3 and 4 in mind, but now I have realised version 2 is still very much used, so I may have to backport my script to version 2.


Stop writing BOMs in UTF-8 encoded files

I’m writing my first PowerShell script, which converts CSV files to XML files.
So far it has been an easy ride, but this kept me occupied for some hours.
I looked at my newly created XML file in Google Chrome:

Looks quite pretty, doesn't it?
Now look at the same file in Microsoft Internet Explorer:

Yes, that's right, just blanks. Why is that?
This is a snippet from my PowerShell script; I think the comments say it all:
# Out-File writes a %&@£€ BOM header
# transXSL "$L" "$xsl" | Out-File $out # DOES NOT WORK! since Out-File writes a %&@£€ BOM header
# So I have to write the file like this so MS Internet Explorer can read the file.

$myXML = transXSL "$L" "$xsl"
[System.IO.File]::WriteAllLines($out, $myXML)

# Google Chrome has no problems mit or mitout the %&@£€ BOM header
# Why in God's name does anyone force a BOM header onto UTF-8 files

The BOM character is valid in UTF-8 encoding, but just because you are allowed to litter does not mean you have to. Stop putting crap into UTF-8 files. And when you read a BOM in a UTF-8 file, just throw it away and proceed with the next character.
I would like an Out-File that writes UTF-8 files without a BOM.
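The same fix can be sketched in Python, where the write/read asymmetry is explicit: the plain 'utf-8' codec writes no BOM, while 'utf-8-sig' tolerantly strips one on read. The file name is made up:

```python
import os
import tempfile

path = os.path.join(tempfile.gettempdir(), "sample.xml")  # hypothetical file name

# Writing: plain 'utf-8' emits no BOM ('utf-8-sig' would prepend one).
with open(path, "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0"?><root/>')

# Reading defensively: 'utf-8-sig' consumes a leading BOM if present
# and behaves like plain 'utf-8' when there is none.
with open(path, "r", encoding="utf-8-sig") as f:
    text = f.read()

assert not text.startswith("\ufeff")  # no BOM leaked into the data
print(text)
```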

Here you can see the bugger; the first characters are the BOM character, confusing a lot of software.