Friday, September 10, 2010

QVPR as a directed graph - part 2

The QlikView 9 management console does not show the complex dependencies between tasks, triggers, and source documents. Also, deleting tasks can leave behind "orphaned" categories and other objects that are not visible otherwise.
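Finding the orphans can be done with a few lines of script. Below is a minimal Python sketch; the element and attribute names (Category, ID, Name, CategoryID, DocumentTask) are assumptions about the QVPR XML layout, so check them against your own repository files first:

```python
# Sketch: find "orphaned" QVPR categories that no task references.
# Element/attribute names below are ASSUMPTIONS about the QVPR schema;
# adjust them to match your Category.xml and DocumentTask.xml.
import xml.etree.ElementTree as ET

def orphaned_categories(category_xml, task_xml):
    # all categories defined in Category.xml
    categories = {c.get("ID"): c.get("Name")
                  for c in ET.parse(category_xml).getroot().iter("Category")}
    # category IDs actually referenced by tasks
    used = {t.get("CategoryID")
            for t in ET.parse(task_xml).getroot().iter("DocumentTask")}
    return {cid: name for cid, name in categories.items() if cid not in used}
```

Anything this returns is a category that survives in QVPR after its tasks were deleted.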

Let's render it with the old and reliable Graphviz. So what does a big graph look like? For example, like this (with names removed, due to the usual non-disclosure agreements):

In case you wonder why it looks small, you are probably using Internet Explorer and its built-in Adobe SVG viewer. It lacks scrollbars when rendering SVG files, but one can right-click and "zoom out" a few times. I highly recommend using any normal browser capable of rendering SVG, such as Firefox, Safari, or Chrome.

-Alex

Tuesday, September 7, 2010

Reload metadata


One is able to parse the *.qvw.log files for statements that load from database tables or Excel files, or that read from / write to QVD files. Again, rendering with Graphviz gives a nice directed graph.
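The parsing can be sketched like this (a minimal Python sketch; the FROM/INTO regexes are assumptions about how the executed LOAD and STORE statements appear in the log, so adapt them to your log format and connect strings):

```python
# Sketch: build a Graphviz DOT graph of one reload's inputs and outputs
# by scanning the lines of a .qvw.log file. The regexes are ASSUMPTIONS
# about how "LOAD ... FROM ..." and "STORE ... INTO ..." show up in the
# log; tune them before trusting the output.
import re

FROM_RE = re.compile(r"FROM\s+\[?([^\]\s;(]+)", re.IGNORECASE)
INTO_RE = re.compile(r"INTO\s+\[?([^\]\s;(]+)", re.IGNORECASE)

def log_to_dot(log_lines, document="Report.qvw"):
    edges = []
    for line in log_lines:
        for m in FROM_RE.finditer(line):
            edges.append((m.group(1), document))   # source -> document
        for m in INTO_RE.finditer(line):
            edges.append((document, m.group(1)))   # document -> QVD target
    body = "\n".join('  "%s" -> "%s";' % e for e in edges)
    return "digraph reload {\n%s\n}" % body
```

Feed the result to `dot -Tsvg` and you get the picture below.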





QVPR as a directed graph

It takes 11 clicks to get QEMC to open the “indented” task dependency view. And you still don’t see it properly.



Wouldn’t a directed graph be easier to understand?

The one below is made by extracting task dependencies from Trigger.xml; task names from ExternalProgramTask.xml, PauseTask.xml, and DocumentTask.xml; document names from SourceDocument.xml; and categories from Category.xml. Rendering is done with Graphviz.
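The extraction itself is small. Here is a Python sketch of the core of it; the element and attribute names (ID, Name, TaskID, TriggeredByTaskID) are assumptions about the QVPR schema, so verify them against your own Trigger.xml before relying on the output:

```python
# Sketch: turn QVPR task files plus Trigger.xml into a Graphviz DOT
# graph. Attribute names are ASSUMPTIONS about the QVPR XML schema.
import xml.etree.ElementTree as ET

def qvpr_to_dot(task_files, trigger_file):
    # collect task ID -> name from DocumentTask.xml,
    # ExternalProgramTask.xml, PauseTask.xml, ...
    names = {}
    for path in task_files:
        for task in ET.parse(path).getroot():
            names[task.get("ID")] = task.get("Name")
    lines = ["digraph qvpr {"]
    # one edge per trigger: the triggering task points at the triggered one
    for trig in ET.parse(trigger_file).getroot():
        src = trig.get("TriggeredByTaskID")
        dst = trig.get("TaskID")
        if src and dst:
            lines.append('  "%s" -> "%s";' % (names.get(src, src),
                                              names.get(dst, dst)))
    lines.append("}")
    return "\n".join(lines)
```

Pipe the result through `dot -Tsvg > tasks.svg` and open it in a browser.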

It looks as if there are 3 reload tasks for the same document. Adding the source document to the graph confirms it.




The tasks below do the same thing. It is easier to replace the two external tasks with one (and make the calls from the batch file).

Adding the source documents will not create a mess.


Thursday, March 4, 2010

QlikView : memory limits

My client has several large machines (128+ GB of memory, 16+ processor cores) for developer use. Tens of instances of the qv.exe developer tool can run fine at the same time.

That is, until somebody makes a loop in the script, loads several million rows, and consumes all available virtual memory. In that case programs (and Windows services) will randomly fail with variations of "out of memory" and "not enough resources". In rare cases the memory shortage is so bad that Windows will corrupt the disk and the registry.

Here is a very good remedy, inspired by the Unix "ulimit": kill all processes consuming more than X amount of memory. Of course it is hard to pick the correct limit. With current amounts of data, well-behaved documents use up to 30 GB of memory, while bad scripts consume in excess of 100 GB and crash. There must be a sanity limit somewhere in between.

powershell -command "& {get-process qv | Where-Object {$_.PeakVirtualMemorySize64 -gt 50000000000 } | Stop-Process } "

PowerShell is not particularly fast, but other command-line tools (e.g. those from Sysinternals) do not correctly display memory usage for 64-bit processes.

The script above is scheduled every 10 minutes and has greatly improved the availability of the servers. Of course, here and there a developer starts yelling, but so far there has always been something wrong with those QVW files. My favourite so far: exporting 50 million rows (with calculated columns) to Excel.

-Alex