Updated 2014-04-11 00:07:33 by RLE

Not Functional Imaging - Scripting Imaging is a response by Philip Quaife to Functional imaging, which looks at implementing in Tcl a paradigm found in functional programming languages.

Description

While the contributions of Functional imaging are useful in themselves, they also show that Tcl does not always map well to these techniques.

That page also touched on a design pattern that is not well-exploited in scripting languages. I attempt to elaborate here.

Please refer to the original page before reading further.

In this discussion we assume that the example given in the above reference is a valid image-processing technique and that the method of x/y transformations is appropriate to real-world tasks. We also assume that the function would in real life be called more than once in the lifetime of the application, for example in face recognition or video editing.

Background

The example implements a combinator function, o, that converts a pipeline of processing tasks into a single function. As an example:
o {gPaint yellow} gChecker {rippleRad 6 0.2} {swirl 26} toPolars  

This creates a proc of the form:
proc uniquename x {
    gPaint yellow [gChecker [rippleRad 6 0.2 [swirl 26 [toPolars $x]]]]
}

We have here the essence of dynamic programming. In the case of Tcl, the creation of a proc is driven by the need to byte-compile the code for speed; otherwise the pipeline could have been implemented with eval.
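
For illustration, here is a minimal sketch of such a combinator. This is not necessarily RS's original o; it assumes the generated name starts with "o " so that compositions can later be found with info procs, as RS notes at the bottom of this page:
proc o args {
    # compose the pipeline right to left into one nested call
    set call {$x}
    foreach f [lreverse $args] {
        set call "\[$f $call\]"
    }
    # drop the outermost brackets: the last stage is the command itself,
    # and defining a named proc gets the body byte-compiled
    proc "o $args" x [string range $call 1 end-1]
    return "o $args"
}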

The main processing loop iterates through the x,y coordinates and calls the arbitrary function with each x,y pair. The result is then appended to the new image data.

Extending dynamic programming

If we instead use specialisation to change the main loop, we get a performance increase. Take the following template:
proc qfim {f {zoom 100} {width 200} {height -}} {
    puts "func {[info args $f]} -> [info body $f]"
    # produce a photo image by applying function f to pixels
    if {$height eq "-"} {set height $width}
    set im [image create photo -height $height -width $width]
    set data {}
    set xs {}
    set rgb [rgb yellow]
    set rgbw [rgb white]
    for {set j 0} {$j<$width} {incr j} {
        lappend xs [expr {($j-$width/2.)/$zoom}]
    }
    for {set i 0} {$i<$height} {incr i} {
        set row {}
        set y0 [expr {($i-$height/2.)/$zoom}]
        foreach x $xs {
            #@FUNCTION
            #@RESULT
            lappend row $res
        }
        lappend data $row
    }
    $im put $data
    set im
}

We can specialise this proc for the following function:
o {gPaint yellow} gChecker {rippleRad 6 0.2} {swirl 26} toPolars  

We get:
proc uniquefim {f {zoom 100} {width 200} {height -}} {
    puts "func {[info args $f]} -> [info body $f]"
    # produce a photo image by applying function f to pixels
    if {$height eq "-"} {set height $width}
    set im [image create photo -height $height -width $width]
    set data {}
    set xs {}
    set rgb [rgb yellow]
    set rgbw [rgb white]
    for {set j 0} {$j<$width} {incr j} {
        lappend xs [expr {($j-$width/2.)/$zoom}]
    }
    for {set i 0} {$i<$height} {incr i} {
        set row {}
        set y0 [expr {($i-$height/2.)/$zoom}]
        foreach x $xs {
            #@FUNCTION
            set y $y0

            # toPolars
            set x1 [expr {hypot($x,$y)}]
            set y [expr {$x||$y? atan2($y,$x): 0}]
            set x $x1

            # swirl 26
            set angle [expr {hypot($x,$y)*6.283185306/26}]

            #rotate
            set x1 [expr {$x*cos(-$angle) - $y*sin(-$angle)}]
            set y [expr {$y*cos(-$angle) + $x*sin(-$angle)}]
            set x $x1

            #rippleRad 6 0.2

            #toPolars
            set r [expr {hypot($x,$y)}]
            set a [expr {$x||$y? atan2($y,$x): 0}]

            #fromPolars [list [expr {$r*(1.+$s*sin($n*$a))}] $a]
            set r [expr {$r*(1.+0.2*sin(6*$a))}]

            #fromPolars
            set x [expr {$r*cos($a)}]
            set y [expr {$r*sin($a)}]

            # gChecker
            set greylevel [expr {(fmod(abs($x),1.)*fmod(abs($y),1.))}]
            set hex [format %02X [expr {round($greylevel*255)}]]
            set pixel #$hex$hex$hex

            # gPaint yellow
            set abspixel [lindex [rgb $pixel] 0]
            foreach var {r g b} in $rgb ref $rgbw {
                set $var [expr {round(double($abspixel)*$in/$ref/$ref*255.)}]
            }
            #c2c $r $g $b
            set res [format #%02X%02X%02X $r $g $b]
            #@RESULT
            lappend row $res
        }
        lappend data $row
    }
    $im put $data
    set im
}


The passing of the function $f is for debugging only.

Note that the above would have been created by the o proc and not hand-crafted. The new proc would be called to create the image. In testing, this change gives a 33% reduction in processing time. By moving constant expressions out of the main loop, we get a speedup of 2 times, which is significant. If some optimisation of the use of expr is done (programmatically), then a boost of 3 times over the original example could be expected.
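
As an illustrative micro-example (not taken from the measured code), a generator could fold constants such as 2*pi/26 in the swirl step at generation time:
# before: the division is re-evaluated for every pixel
set angle [expr {hypot($x,$y)*6.283185306/26}]
# after programmatic constant folding:
set angle [expr {hypot($x,$y)*0.2416609733}]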

If we were to apply specialisation to the whole procedure, we would also specialise the code for processing the width and height. This would include pre-creating the lists for i and j and removing the if statement that tests whether $height is -, since we know what the value is.

I have not demonstrated the code to generate the template, because the image processing functions were originally coded as procs; in this example they would instead be implemented as lists, so that combination would be a trivial process.

That said, the use of info body would allow the extraction of the code from existing procedures, provided that the generator can determine what the return values would be and inline them as assignments to x and y.
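
A minimal sketch of such a generator, assuming the straight-line code has already been produced, with string map doing the splicing (as RS also suggests at the bottom of this page):
proc specialise {template name code} {
    # rebuild the argument list, preserving any default values
    set arglist {}
    foreach a [info args $template] {
        if {[info default $template $a d]} {
            lappend arglist [list $a $d]
        } else {
            lappend arglist $a
        }
    }
    # splice the generated code over the #@FUNCTION marker
    set body [string map [list "#@FUNCTION" $code] [info body $template]]
    proc $name $arglist $body
    return $name
}

Called as specialise qfim uniquefim $code, this would produce a proc like the one shown above; fully specialising width, height, and zoom would work the same way, splicing precomputed literals (such as the xs list) over further markers.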

Further applications

We find that a common task for code libraries is initialisation. This takes two forms:

  1. First-invocation environment creation.
  2. Application-specific environment configuration.

Both of these are candidates for specialisation. Consider this example:
proc dbOpen {args} {
    variable Data
    if {![info exists Data(Init)]} {
        dbInit
        dbConnect
        rename unknown _unknown
        proc unknown {args} {dbUnknown $args}
    }
    ...
}

To remove the need for an application-specific call to initialise the environment for the module, the main API function does this automatically. The downside is that every call to dbOpen has the overhead of checking whether this is the first call.

The new approach:
proc real_dbOpen {args} {
    variable Data
     ...
}

proc dbOpen {args} {
    dbInit
    dbConnect
    rename unknown _unknown
    proc unknown {args} {dbUnknown $args}
    rename dbOpen {}
    rename real_dbOpen dbOpen
    uplevel dbOpen $args
}

Here we incur the overhead once and then no checks are ever made on subsequent calls to dbOpen.

An alternative that removes the need for two procedures is:
proc dbOpen {args} {
    #@INIT
    dbInit
    dbConnect
    rename unknown _unknown
    proc unknown {args} {dbUnknown $args}
    set body [info body dbOpen]
    regsub {#@INIT.*#@DONE} $body {} body
    proc dbOpen [info args dbOpen] $body
    #@DONE
    variable Data
     ...
}
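
To see the effect (assuming dbInit, dbConnect, and dbUnknown exist, the elided body is filled in, and "mydb" is a placeholder argument), one can verify that the initialisation block disappears after the first call:
dbOpen mydb
string match {*#@INIT*} [info body dbOpen]  ;# now returns 0: the block is gone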

This will work well with autoloading and other packaging systems.

Conclusion

Scripting languages are always handicapped for speed by the nature of the language. Any technique that can boost speed can only be a good thing.

The classic example (IMHO) is Stephen Uhler's html_library, where data is the program. This should be required reading for all Tcl newbies. Scripting is more than just an interpreted-vs-compiled paradigm. The ability to change the code during execution is a fundamental concept that needs to be demonstrated more.

jcw: You said "... If we instead use specialisation to change the main loop we get a performance increase ...". What sort of performance effects do the above techniques have versus parametrization through procs, evals, and variable expansion?

PWQ: I am not sure I understand the questions, so I'll take them literally. Specialisation removes parameters. I did not show this, but taken further there would be a proc called uniquefiq_z100_h200_w200_filter... that would need no arguments and would have no conditionals in its body.

Evals are to be avoided IMO, as they both introduce complications with quoting arguments and are inefficient. Evals would be the opposite of specialisation. The best form of specialisation would be aliases.
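
To illustrate the contrast (command names here are hypothetical):
# eval re-parses its script on every call and needs careful quoting:
eval myGenericCmd [list $a] [list $b]
# an alias binds the fixed words once, with no quoting hazards:
interp alias {} mySpecialCmd {} myGenericCmd -mode fast
mySpecialCmd $a $b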

As another example, take the above dbOpen case. info exists is a classic case: we must process the first argument, exists, to work out what processing must be performed. Every time the code executes, the argument never changes, but we must always do the test.

Imagine that dbOpen was used in a web server that was to serve 100,000 pages per day. While info exists does not take a lot of time, in the above application it would be a significant part of the processing. So if we can either specialise the call to info exists or, in the above case, remove it completely, then we have significant performance gains. Other obvious candidates are the switch statement, as well as fixed arguments to procs (e.g. regexp -nocase).
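
As a sketch of what specialising that test could look like with an alias (the names and the ::db namespace are hypothetical; as RS notes below, an alias itself costs a name lookup, so the real win is removing the test altogether):
# curry the complete [info exists ...] call into a single command,
# so nothing is parsed at the call site
interp alias {} dbInitDone {} info exists ::db::Data(Init)

if {![dbInitDone]} { dbInit }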

To take another technique as an example, consider this web-server code:
[gets $socket]

This looks like a mistake, but if we leverage unknown and aliases and then create a specialist proc (e.g. proc {GET http://host/index.htm HTTP/1.0} {socket} {...}), then we can cache page requests without any conditional logic to slow the process.
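
A minimal sketch of the idea (generatePage is a hypothetical placeholder; the request line read from the socket is itself the command name):
# first sight of a request line: unknown fires, generates the page,
# and installs a proc named after the request line; every later
# identical request dispatches straight to the cached proc
rename unknown _unknown
proc unknown {args} {
    lassign $args request socket
    if {[string match {GET *} $request]} {
        set page [generatePage $request]   ;# hypothetical page builder
        proc $request socket "puts \$socket [list $page]"
        return [$request $socket]
    }
    uplevel 1 [linsert $args 0 _unknown]
}

# in the connection handler, dispatch is then just:
[gets $socket] $socket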

The question I would ask is this: how many programmers would think of programming using this technique, versus the standard way it would be done in other languages by passing the request as an argument to a generic procedure?

This probably does not answer your question; please give examples to clarify.

jcw - Philip, I see it's me who needs to think more before asking here. I'll re-read to better understand what you're after...

RS: Well, Functional imaging was a Tcl remake of a research project; I tried to follow the Haskell original as closely as possible, and they just used functional composition. Also, the creation of procs isn't very expensive in my opinion, and it has the advantage that all available combinations can be introspected with just
info procs {o *}

and so the button bar was easily created. (Also, I think Tcl lends itself pretty perfectly to functional programming...)

PWQ: I hope you did not take these comments as disparaging remarks about Functional imaging. There is no ideal way to program, and I think it is important to look at how other patterns and techniques could be used in Tcl.

However, since there will be a large body of novice users who take code from places such as this wiki, I feel that both sides of the coin, as it were, should be represented.

As an example, we can show how a bubble sort could be implemented in Tcl. But by doing this, a large body of programmers will never know that there are more efficient methods of sorting, and possibly that there is a built-in sort command (lsort) that can be leveraged.

RS: Ah, your reply to jcw above made your idea of specialization a bit clearer to me. I fully agree that redundant checks bloat code and make it less readable - simpler is better :) And getting simplicity without sacrificing robustness even more so. One (optical) example of simplification in Tcl is currying (Custom curry), but it works only for single commands, whose arguments may be augmented in front (and it costs one more name lookup, so it isn't faster):
interp alias {} mySpecialCase {} myGenericProc $width $height ...

while you advocate a more elaborate way of building proc bodies from templates, with the cases taken care of (by string map) at generation time rather than checked at runtime. Sounds good. Will have to think...