Happy new year

It seems there is a new year every year, and it’s customary to sum up the year that has been and make predictions for the year to come.

Last year I had great expectations for 2014, but as always things seldom work out as expected. Professionally, 2014 was not as productive as I had expected.
I hoped to work with Business Intelligence at a corporate level; that didn't happen. But a sourcing app in the Data Warehouse has now been adopted by all Business Areas. (The year before, I completely failed to show the similarities between Business Intelligence and Master Data Management to the logistics people in the group; the idea was to demonstrate how they could pick low-hanging fruit with reports they already had.)
I hoped to launch a Group-common Master Data Management project; that didn't happen, the business areas were not ready to start, but I have great hopes for this year: we have a kick-off meeting scheduled. Dedecoretur qui succumbit (shame on him who gives up).
For various reasons I did not do much IT architecture work; other things came in between.
On a more personal level I hoped to blog less, and to learn JavaScript. That did not happen: I blogged a bit more than the year before, and only did tiny bits of JavaScript. I probably know JavaScript less well today.
A highly appreciated colleague decided to leave the company. This is natural, colleagues come and go (I have done the same), but sometimes it hurts more.
I probably did some good things too, but I focus on the negative; it's my personality, you know, the glass is half empty.

But I’m not too depressed, 2015 will be better.
The common Master Data Management project will start.
Hopefully I will work with enterprise data integration. (I was involved in the creation of the company's first data integration system, the Funnel; after some 35 years of service it will be buried in the graveyard of unsung heroes in 2015.) In 2015 we will introduce the publish-and-subscribe integration model.
Right now I'm testing a new model for integrating vendors of components for our products, though I'm far from convinced it will fly.

I will do more JavaScript coding.
The Perl6 project will ship their first production release, I read it on the Internet so it must be true!!
I will try to read about information warfare systems engineering. It is of vital interest for the democratic world to keep the upper hand in information warfare, and I want to know what happens in this area.

I hope 2015 will be better for me than 2014. On a global scale I fear things will grow even worse in 2015. Writing about personal ambitions for the next year is maybe petty; e.g. we are frying the world and fooling ourselves that we can do something about it. If we were serious about this we would drastically decrease the production of fossil fuels; this will only happen after some billion people have died in natural catastrophes as a consequence of global warming.
I have an ambition to start a blog in Swedish, a language I master, writing about politics. I do not for a second think it will have an impact, but it might make me feel better.

Despite this post I sincerely wish you all - A happy new year!


Deep machine learning and intelligence

Stockholm, Gärdet 2014-12-29

Andreas sent me this link, where Jeremy Howard talks about machine learning and an emerging IT revolution where computers replace humans for somewhat more advanced intellectual work than today's ERP, CAD, etc. systems are capable of. A very interesting TED talk. If I understand Mr Howard right, he implies the technological singularity is imminent with the help of an algorithm called deep learning. Earlier this year I found a prediction on Wikipedia that this will happen around 2040; I doubt it. According to Mr Howard the necessary building blocks for superhuman intelligence are already here; we just have to put the pieces together better, and this will go fast. What I am absolutely certain about is that the pace of computer ability is accelerating exponentially, and every new prediction tends to be closer in time, but not for superhuman intelligence.
Predictions of computer ability in general resemble the predictions for mapping the human genome: it was thought to take an excruciatingly long time, until Craig Venter came along. I think predictions of the singularity will rather resemble the predictions of the commercial breakthrough of the fuel cell: it is always twenty years in the future.

This doesn't mean computers are not getting smarter. Computers, or rather computer programmes, are transcending from aiding decision making to making decisions. This has been going on for some time now, and the pace is accelerating; we are at the dawn of a new computer revolution where intellectual workers will be replaced by computers. And this is pretty important: in five to ten years we will have a new world, with new opportunities, not all good.
Some weeks ago I participated in a seminar at KTH discussing digitalisation and business opportunities. The participants were top IT managers from some of Sweden's biggest corporations, plus me. One participant told us that in the near future no one will let humans take care of their financial investments: 'Do you let a human do your investments? Are you stupid!' We were all in agreement that this new IT revolution has already begun and that there are great (business) opportunities for us all.

But as for computers being smarter than us, I think we have a long way to go. Mr Howard is most probably right that machine learning is important for computer development. He says 'the machine learning revolution never settles down', but how does he know? There are limits to all things under the sun (and beyond), ultimately physical limits like the laws of thermodynamics. Quantum mechanics most likely has limitations too. In my somewhat pessimistic post I state that man has the ultimate intellect: we can figure out anything logical, just given time. But it might so happen that speed is what smartness is all about; if so, computer programmes will prevail over man. Shortly after, man will most likely be annihilated; fortunately for me this will only happen twenty years from now.


Why do you use PHP?

My nephew asked me 'why do you use PHP?'. Only one of my nephews can ask such a question; the other one is playing ice hockey in the US and has no more interest in programming than the rest of the Clan. On Christmas Eve we all met for Christmas dinner (the other nephew participated via Skype). Why PHP?

In the year 2000 I was back working for Atlas Copco. Up until then I had only used SAS Institute software in Windows and Unix environments (I was a mainframe guy at heart, still am if truth be told). I had done some Bourne shell scripting and some Windows bat files and a bit of C and Java, but nothing serious. At the time I was about to create a Business Intelligence system; I had lots of business people crying for better reports, but no one (top management) was willing to pay for a top-notch SAS Institute BI system. I decided to go the other way: instead of spending big bucks I decided to go for a minimal budget, just for fun if nothing else.
I started with a scrapped PC where I installed Mandrake Linux and MySQL. Since I had written the mainframe ERP system myself, I had no problem extracting the marketing and production figures the business people wanted to analyse. But how the ---- should I get the data into the MySQL database? I needed a language with a MySQL interface where I could express some rudimentary logic, like signalling if data loading into MySQL failed.
I needed a simple language with a low barrier to entry; I didn't have the time to learn enough C or Linux shell scripting. When I looked around I came up with two candidates: Perl and PHP. I had been playing with Perl, but PHP was new to me. Perl was the obvious choice: I knew the language, it was mature and well suited for the task, while PHP version 4 beta was a primitive language for (simple) web server scripting, a very awkward choice for what I had in mind. But PHP was new and unknown to me, I just could not resist trying it out, and I have a taste for betas; trying out new functionality is fun. When the other kids on the block come screaming 'look what I have done!' I enjoy saying 'I did it years ago'; it boosts my ego. And I like to go my own way. The BI system I created was against every IT policy the company had; I could not do it according to the company rules, it was either this new system or the old paper reports the users did not want.
After some years my initial very simple PHP script had evolved into a new language, ITL. So here I am in 2015 with a language of my own for a BI system based on PHP running on Linux. PHP has evolved, and today it is a mature and in many ways good language. It is hard to dynamically execute code, parallel execution is a bit odd and I do not know if asynchronous I/O can be done, but otherwise pretty much anything can be done with a simple high-level syntax.
I would not do systems programming in PHP; for that I would use C, assembler or the D language, which I find very interesting.

Would I use PHP if I started all over again with my BI system? No, there are fresh new languages out there waiting to be explored. Recently I have used Node.js, but Python or Perl6 is probably the language I would go for.
If I were to do something in the area of Enterprise Integration, an area I have taken an interest in lately, I would start with Erlang/OTP, if only I could get my head around Erlang; the language seems to be made for guys smarter than me.

I have never regretted my choice of PHP. The language has evolved nicely and has an interesting future with new projects like Hack, and the next version of PHP itself seems to be almost revolutionary in terms of performance. With each version PHP looks more and more like the scripting language next door; a bit boring maybe, but it is good for the programming community and the world.


Merry Christmas from PhpMetrics

If this was a non-figurative Christmas decoration made by PhpMetrics, I would be really impressed. But it is the evaluation of my code for the Data Warehouse :(
Thanks PhpMetrics for the appreciation of my hard work.
PhpMetrics is a piece of software that analyses PHP code and visualises the result: the redder and bigger the circles, the worse the code.
PhpMetrics is great, extremely simple to install and use, and what a Christmas decoration! The colors of Christmas: red and green, plus some yellow. This could be a trivet for Christmas made by Alvar Aalto.

I will try to understand why my code rates so poorly. Browsing the documentation I saw that eval and goto rate badly. Eval, this evil instruction: I had no other option in PHP, I was forced to use it. As for goto I'm of another opinion; sometimes goto is just the right instruction. I saw PhpMetrics also analysed some other software products I use, since it took the entire directory structure; I suspect the combination may explain some red spots. I do not use much OO code and there are some horrid test programs still in there. And I use one GLOBAL in almost every function (accessing the log routine). And the bulk of the code was written between 2004 and 2007. Lots of excuses; maybe it is as simple as 'the code is crap', or PhpMetrics is good for creating nice-looking images with little meaning.

Anyway, the picture is great, I simply love it.
Try to beat my code, create a more Christmassy graph with PhpMetrics if you can.

From me to all (well almost) of you - A Very Merry Christmas.


SAP SNC integration, Enter Powershell

For me, shell programming is mostly about automating simple IT admin tasks like reorganising database tables, compacting file systems, etc. I'm not very good at doing the same thing twice, so I tend to write 'admin procedures' for many small tasks. Over the years I have used a lot of shell scripting languages. The first I used were IBM mainframe EXEC/CLIST (Command List) and JCL (Job Control Language). JCL is not normally regarded as a shell scripting language; anyway, that was many years ago and both these languages were pretty primitive/crappy. In all fairness, IBM has also constructed one great (shell) scripting language: REXX.

When you write a shell script you often need a good understanding and control of the environment and context where your script runs. Good interfaces to the underlying opsys and commands are essential, as are predefined constants like script name, opsys version, current working directory, etc. My 'shell processes' often consist of an initial part maintaining configuration parameters and another part acting upon the 'computer environment' with the help of these parameters.
Now I'm creating an interface between SAP SNC and external ERP systems, a task where a shell scripting procedure may seem an odd choice. But I got some odd requirements: the interface must work on a standard Windows desktop and be very easy to distribute and handle, since the users of this interface are not expected to be IT-savvy. Anyway, I thought why not give Powershell a go, and contain the entire interface in one script for easy export/import. If you read the link above you know SAP SNC spits out a rather ugly CSV file, while the receivers expect a decent structured XML file. I decided to do the file metamorphosis from CSV to XML in two steps: first transform the CSV file into a SAP SNC XML format, and then transform that XML file with an XSL script into the final XML format. It was surprisingly easy to create a draft. It only took some hours, thanks to some powerful functions in Powershell, e.g. a one-liner to parse the complex SNC CSV file:
$olArray = Get-Content 'CSV-FILE-NAME' | Select-Object -Skip 11 | ConvertFrom-Csv
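For readers outside the Windows world, the same skip-the-header-and-parse idea can be sketched in Python. This is an illustration only, not my production script; the tag names and the helper name are made up for the example:

```python
import csv
import xml.etree.ElementTree as ET

def csv_to_xml(lines, skip):
    """Skip the file header lines, parse the rest as CSV, emit a simple XML tree."""
    reader = csv.DictReader(lines[skip:])
    root = ET.Element("Orders")  # hypothetical tag names, not the SAP SNC schema
    for row in reader:
        line = ET.SubElement(root, "OrderLine")
        for key, value in row.items():
            if key and value:  # drop empty trailing columns
                # turn a label like "PO No." into a legal tag name
                tag = key.replace(" ", "_").replace(".", "")
                ET.SubElement(line, tag).text = value
    return ET.tostring(root, encoding="unicode")
```

With the real SNC file you would skip the first eleven lines, just like the Select-Object one-liner above.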

Another very pleasant surprise was the abundance of useful information I found on the web; I could pinch a lot of code directly from blog posts. I didn't expect that, as in the past I found free (as in free beer) Microsoft information rare.
It is simple to create popups for text input and file selection, which is of great value for configuration. 

But there were some nasties too, and the worst is actually really bad: functions reply with all output produced, e.g.:
return 'OK'
does not necessarily return only 'OK'; any other output created in the function is also pushed onto the 'return stack'. This forces you to do some extraordinary cleansing after the call:
$returnValues = My-Function "$Parameter"   # My-Function is a placeholder name
$OKstring = $returnValues[-1]  # pop the last value off the 'return stack'
This is exacerbated by the fact that any value not captured or piped anywhere in the code is automatically sent to standard output, making it appear on the 'return stack', which makes it hard to find all the things pushed onto it. I find this behaviour very strange, and it took me some time to figure out. I hope I'm wrong about this, or that there is a clear 'return stack' option like: return -only 'OK'

The Powershell ISE hangs every now and then when I use popups, which seem to be placed haphazardly on the screen depending on the Windows version.
All in all I find Powershell scripting a pleasant experience; it is a very powerful language, although a bit quirky. I have only been coding for about 15 hours with no training upfront, so I might have got things wrong.

My script now takes this:
to this:

I'm happy with my script. It is an easily configurable script able to run on most Windows systems, or so I hope. I have written the script with Powershell versions 3 and 4 in mind, but now I have realised version 2 is still very much used, so I may have to back-port my script to version 2.


Stop writing BOMs in UTF-8 encoded files

I’m writing my first Powershell script, which converts CSV files to XML files.
So far it has been an easy ride, but this kept me occupied for some hours.
I looked at my newly created XML file in Google Chrome:

Looks quite pretty, doesn't it?
Now look at the same file in Microsoft Internet Explorer:

Yes that's right, just blanks. Why is that?
This is a snippet from my Powershell script, I think the comments say it all:
# Out-File writes a %&@£€ BOM header
# transXSL "$L" "$xsl" | Out-File $out # DOES NOT WORK! since Out-File writes a %&@£€ BOM header
# So I have to write the file like this so MS Internet Explorer can read the file.

$myXML = transXSL "$L" "$xsl"
[System.IO.File]::WriteAllLines($out, $myXML)

# Google Chrome has no problems mit or mitout the %&@£€ BOM header
# Why in God's name does anyone force a BOM header onto UTF-8 files

The BOM character is valid in UTF-8 encoding. But just because you are allowed to litter does not mean you have to. Stop putting crap into UTF-8 files. And when you read a BOM in a UTF-8 file, just throw it away and proceed with the next character.
I would like to have an Out-File that writes UTF-8 files without BOM characters.
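The behaviour I want from Out-File is easy to show in Python; this is just an illustration of the principle (write without a BOM, tolerate one on read), not a fix for Out-File itself:

```python
import codecs

def write_utf8_no_bom(path, text):
    # Python's plain "utf-8" codec never writes a BOM
    with open(path, "w", encoding="utf-8", newline="") as f:
        f.write(text)

def read_utf8_strip_bom(path):
    # Tolerate a BOM on input: throw it away and proceed with the next character
    with open(path, "rb") as f:
        data = f.read()
    if data.startswith(codecs.BOM_UTF8):  # b'\xef\xbb\xbf'
        data = data[len(codecs.BOM_UTF8):]
    return data.decode("utf-8")
```

Python even has a dedicated codec, utf-8-sig, that does this stripping for you on read.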

Here you can see the bugger: the first characters are the BOM character, confusing a lot of software.


SAP SNC integration, part 1

Recently I have had the pleasure of digging into the SAP SNC (Supply Network Collaboration) download/upload interface. SNC is a portal for suppliers. I have no experience of SNC myself, but it is a portal where suppliers can log in and work in our SAP ERP; one key area is probably vendor-managed inventory.
SAP SNC also supports up- and download of files for suppliers that do not want to enter transactions in real time but would rather upload them in a file, or download status reports, etc. For this SAP SNC uses a rather complicated comma-separated value file format.

The Company has many small suppliers that want to interface SAP SNC automatically with their own ERP systems. In order to do so they need to transform the CSV files of SNC into a format their ERP systems understand. And here is where we and the suppliers have a problem: they do not have an IT department that can transform SAP SNC's complicated CSV files into a format their ERP system understands. The SAP support guys at the company turned to me for help.
We could go one of two ways: either do the transformation inside SAP SNC, so the suppliers communicate with files in 'their' formats, or give the suppliers transformation routines. Creating the transformation inside SAP SNC would take a long time and be costly. I decided to try to build a transformation routine in the suppliers' MS Windows environment. I did not want to rely on software that is not standard in MS Windows; I wanted to use as many standard concepts and as much standard software as possible. These days the XML file format is a well-established standard, and it seems most of our suppliers prefer XML formats.
It is actually a mystery to me why SAP decided to go for CSV formats for these integration files; SAP is normally a strong proponent of XML integration, even when XML is not a good choice. XML can be very verbose, and if you send information about very many objects these streams may constipate your network. XML, or SAP's own IDOC format, looks ideal to me for SNC integration. Maybe I have misunderstood something, but as it is now I have to deal with the CSV format; IDOCs would have been much easier to transform.
There is no standard for parsing and transforming CSV files, and certainly not the CSV files of SAP SNC, but there is an established and good standard for XML file transformation: XSLT. You can use XSLT to transform CSV files, but it is complicated and XSLT is not designed for that. I decided to build my own SAP SNC CSV file parser with tools available on a standard MS Windows PC. I did not want to invest too much time and effort in this solution, since I'm convinced SAP will redesign this interface in the not too distant future. I did not want to change my procedure for every 'supplier' format. I also wanted to give the suppliers something they could modify themselves. And lastly I wanted to learn something new; I'm not very comfortable in Windows, so almost anything except bat files would be a learning experience. After studying the alternatives I decided to use Powershell scripting, something completely new to me. After browsing an introductory tutorial and some Googling I started to build my CSV file parser. The suppliers have to download the SNC CSV file, then run my parser script, and then upload the transformed file into their ERP system. This parser is not beta tested yet, but when beta testing is done I will write a post about it.

I think this SNC up- and download integration is badly designed by SAP. If I had designed this interface, it would at least provide you with three options: CSV, XML/IDOC or JSON file formats. What I would really like to see is an easy2use high-level toolbox in SAP for application integration. I think SAP customers deserve something better than the present CSV files, and I believe SAP can afford to develop such an interface. I might have missed something; is there something already available I'm not aware of?


Bad Design

Part 1.

This summer my dishwasher broke down. It was a Miele from the nineties, a very good dishwasher I was happy with; a bit noisy, but it washed my dishes all right. But this summer the water pump broke. Instead of replacing the water pump I decided to buy a modern dishwasher. I went for a Bosch machine and installed it some weeks ago. It does a good job, the dishes come out clean, but still I'm a bit disappointed: the new machine is a tiny bit less noisy but it takes longer to do the dishes, and I do not like that. And it doesn't dry the dishes as well as the old one; I do not like that either. But what really bugs me is the cutlery basket of the Bosch dishwasher:
In front, the old lighter Miele cutlery basket; behind, the darker Bosch basket.

As you can see, the handle of the Miele basket is strengthened by two beams for better stability. The Bosch handle is reinforced by beams on the inside of the handle, creating perfect pockets that trap condensing steam, making sure you always get wet and water drips down on the cutlery when you grab the handle.
The Bosch handle with inside water trapping beams.

This is not only an example of bad design but also of inadequate testing. If the QA guys in the Bosch laboratory had bothered to test the cutlery basket, they would have spotted the problem and sent it back to the developers.

Part 2.

Last week I had a call from a colleague: 'Can you help with an interface? We need to transfer purchase orders from our SAP ERP system to the ERP systems of vendors.'
The vendors didn't want to use the web portal as such, but preferred to exchange orders and acknowledgements via files. The problem was that our ERP system could only communicate via CSV files while the vendors' systems expected messages in their XML formats. If you have XML files on both sides, a small XSL script is all you need, but our ERP system's vendor portal can only communicate via CSV files. The year is 2014! I would not have created such an interface.

"Interface Type:","POCONF",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Owner Partner:","XY00000025",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Selection Profile Number:","00000000000001000045",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Selection Profile Name:","PD_CONFIRM_PO",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Created By:","XYZ123",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"Created On:","17.11.2014 10:47:59 CET",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
"PO No.","PO Item No.","Requested","Confirmed","To Be Confirmed","To Be Rejected","Product","RevLvl","Quantity","UoM","Deliv. Date","Deliv.Time","DlvTZ","Ship. Date","Ship. Time","ShipTZ","Requested Price","Confirmed Price","Crcy","PrU.","PrUoM","Reference Document Number of Sales Order","Requested MPN","Confirmed MPN","Requested Mfr","Confirmed Mfr","Customer Loc.","Component ID","Product","RevLvl","Requirement Date","Qty","UoM","Cust. Batch"
"4700000386","20","X","","","","ABC0221300","","2.200","PC","05.01.2015","00:00:00","CET","24.12.2014","23:59:55","CET","4,85","0.000000 ","SEK","1","PC","","","","","","6100","","","","","","",""
A CSV file, SAP interface format.

As you see, this is some CSV file! First we have a file header, then two(!) lines of labels and then the purchase order lines. Anyone with experience of parsing CSV files understands this is not the easiest file to parse.
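Just as an illustration (in Python, with an invented helper name, not my actual parser), the layered file can be pulled apart once you notice that the metadata rows all have a first cell ending in a colon:

```python
import csv
import io

def split_snc(text):
    """Split an SNC-style CSV into (metadata dict, data rows as dicts).
    Assumes the metadata lines look like "Some Label:","value",,, ..."""
    rows = list(csv.reader(io.StringIO(text)))
    meta = {}
    i = 0
    while i < len(rows) and rows[i] and rows[i][0].endswith(":"):
        meta[rows[i][0].rstrip(":")] = rows[i][1]
        i += 1
    labels = rows[i]  # the first non-metadata row carries the column labels
    data = [dict(zip(labels, r)) for r in rows[i + 1:] if r]
    return meta, data
```

Note a further nasty in the real label row: some labels ("Product", "RevLvl") appear twice, so a plain dict keeps only the last of each pair; a real parser has to handle that too.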
This is not only an example of bad design but also of inadequate testing. If the QA guys in the SAP laboratory had eaten their own dog food, they would have sent this back to the developers.


PHP 5.6 Mojibake

The three little piggies have shown their ugly faces and bitten me again. The three little piggies are the Swedish national characters ÅÄÖ. The three letters with the ring and the dots are more than cute IKEA dots, as I once had them described to me; they are vowels in their own right. Å is pronounced as the vowel sound in the English word 'your', Ä as the vowel sound in 'bear' and Ö as in 'sir'. These three letters have haunted me my entire professional career, from EBCDIC on the IBM mainframes and now lastly in PHP 5.6. PHP now has a default encoding setting, and it is UTF-8 by default. This is in itself a good thing; prior to 5.6 character encoding was a mishmash of ISO-8859-1 and UTF-8. I knew of the change, but I didn't pay attention to it and didn't test it. I did not even ask someone else to test it. I should have known better. All the Ås, Äs and Ös were distorted when we made the switch to 5.6. In our Nantes plant they had managed to put Äs into some part numbers (material numbers in SAP English), which caused some 'disturbance' in BOM structures and some other places. How the French guys had managed to use the Swedish Ä in part/material numbers is a mystery to me; it might just have been some double encoding error, I do not know.
The remedy for this mistranslation is simple, just set default_charset='ISO-8859-1', but since I had not tested that, I set the old ICONV( ISO-8859-1, ISO-8859-1, ISO-8859-1) instead, although it is deprecated in PHP 5.6. I had tested ICONV before and was pretty sure it would work. The right solution is of course to figure out what the problems with UTF-8 are and fix them. Since encoding is now addressed in a clear way in PHP, it should be simpler to attack the problem; simple it will never be, but simpler than before version 5.6.
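The mechanics of the distortion are easy to reproduce outside PHP. A small Python illustration (I use Windows-1252 rather than strict ISO-8859-1 here, only because every byte then maps to a visible character; the mix-up is the same idea):

```python
# UTF-8 encodes Å, Ä, Ö as two bytes each; read those bytes back with a
# one-byte-per-character decoder and you get the classic mojibake.
for ch in "ÅÄÖ":
    garbled = ch.encode("utf-8").decode("cp1252")
    print(ch, "became", garbled)  # e.g. Ä became Ã„

# The damage is reversible, as long as nothing else has touched the text:
fixed = "Ã„".encode("cp1252").decode("utf-8")
print(fixed)  # Ä again
```

This is also why a double encoding error, like the one I suspect in Nantes, is plausible: each wrong round trip is mechanical and can stack on top of the previous one.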