Importing Qlikview logs into MySQL

I have for a very long time tried to come up with ways to measure and compare Business Intelligence activities. So far I have not come up with anything that holds water. It is complicated enough to measure the activity in one BI system alone; comparing two different systems is even more complex. What I have in mind is to capture some figure for all activities in the Data Warehouse, which will at least give an indication of how much the Data Warehouse is used.
The stats I capture today are batch jobs and MySQL queries. This is just an indication of activity; it gives no hint of the quality or the value of the system.
Qlikview is becoming more and more popular among the Data Warehouse users as a viewer, so I felt it was appropriate to include Qlikview in the overall activity figures of the Data Warehouse. That is what this post is about.

When I created The Data Warehouse Movie I had a hard time parsing the QV log: the columns did not seem to be separated, there was just space in between, column entries could themselves contain spaces, and missing entries were simply absent. This time I displayed the log in hex and found that the columns are separated by the tab character (hex 09), which makes the logs much simpler to parse. The next hurdle: the Data Warehouse runs on Linux and Qlikview runs on Windows, and I did not want to set up any procedures on the Qlikview server, so I decided to grab the log files via a CIFS mount. I created this ITL procedure:
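Knowing that the fields are tab-separated, parsing becomes trivial. A minimal sketch in shell; the sample line and field layout here are illustrative, not copied from a real session log:

```shell
#!/bin/sh
# Split a log line on the tab (hex 09) separator with awk.
# -F '\t' makes awk treat the tab as the field separator, so spaces
# embedded inside a field no longer break the parse.
printf 'Server\t11.20\t2014-05-01 08:00:00\tSALES.QVW\n' |
awk -F '\t' '{
  print "ExeType:    " $1
  print "ExeVersion: " $2
  print "Started:    " $3
  print "Document:   " $4
}'
```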
I’m very happy with this procedure. When I started I thought it would be very hard to import the logs, but this is a walk in the park, child’s play!
The second <action> tag in the <init> section selects which logs to import. With:
<action sync='yes' cmd='ls @WINMNT/Sessions_SSCSSEQVS002_2014-*.log > @J_DIR/logs.txt' dir='@J_DIR'/>
I downloaded all Qlikview session logs from 2014 in one go; it took some 140 seconds.
As it is set up now, I will schedule this job at 01:00:00 and import yesterday’s logs.
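For the nightly run, the pattern matching just yesterday’s logs can be built from the date. A sketch assuming GNU date (the -d option) and the naming convention from the example above; the script path in the crontab comment is hypothetical:

```shell
#!/bin/sh
# Build the file pattern for yesterday's Qlikview session logs.
YESTERDAY=$(date -d 'yesterday' +%Y-%m-%d)
PATTERN="Sessions_SSCSSEQVS002_${YESTERDAY}.log"
echo "$PATTERN"
# A crontab entry for the 01:00:00 run could look like:
# 0 1 * * * /path/to/import_qlikview_logs.sh
```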

I still like the Integration Tag Language: it is simple, succinct, and does the job. The procedure should not be hard to read and understand.
The task that took the longest time was defining the MySQL table:

CREATE TABLE `qlikview_session` ( -- table name is illustrative
 `ExeType` char(5),
 `ExeVersion` char(20),
 `ServerStarted` timestamp,
 `Timestamp` timestamp,
 `Document` varchar(200),
 `DocumentTimestamp` timestamp,
 `QlikViewUser` char(12),
 `ExitReason` varchar(64),
 `SessionStart` timestamp,
 `SessionDuration` time,
 `CPU` int unsigned,
 `BytesReceived` int unsigned,
 `BytesSent` int unsigned,
 `Calls` int unsigned,
 `Selections` int unsigned,
 `AuthenticatedUser` varchar(30),
 `IdentifyingUser` varchar(30),
 `ClientMachine` char(56),
 `SerialNumber` varchar(32),
 `ClientType` varchar(64),
 `ClientVersion` char(10),
 `SecureProtocol` char(3),
 `TunnelProtocol` char(3),
 `ServerPort` int unsigned,
 `ClientAddress` int unsigned,
 `ClientPort` int unsigned,
 `CalType` char(16),
 `CalUsageCount` varchar(25),
 PRIMARY KEY (`SessionStart`, `AuthenticatedUser`)
);
I trial-and-errored the table definition a few times until loading was OK. If you happen to know the proper table definition, please drop me a line.
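With the table in place, the tab-separated logs can be bulk-loaded with MySQL’s LOAD DATA. A sketch of how I would construct the statement; the table name, file path, header line, and CRLF line endings (the logs come from a Windows server) are all assumptions:

```shell
#!/bin/sh
# Build a LOAD DATA statement for one tab-separated session log.
# The path and table name are illustrative.
LOGFILE=/mnt/qlikview/Sessions_SSCSSEQVS002_2014-05-01.log
SQL="LOAD DATA LOCAL INFILE '${LOGFILE}'
  INTO TABLE qlikview_session
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '\r\n'
  IGNORE 1 LINES;"
printf '%s\n' "$SQL"
# On the real server, pipe it into the mysql client, e.g.:
# printf '%s\n' "$SQL" | mysql --local-infile=1 the_dw_database
```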

Now you may say ‘This seems a bit awkward, why download Qlikview logs into MySQL, why not use Qlikview?’. That is a good question. We already have a Qlikview app for the logs, but I have the Data Warehouse twittering app written in ITL and all the other stats in MySQL, so I thought it would be nice to have the Qlikview stats in MySQL as well.
If and when I have verified the downloaded data and implemented it in some app, I will probably write a second post.
