Tuesday, September 27, 2005

Brands are Related to Patterns

Thanks to Scott Wiener, here's an old (1991) paper by Richard Gabriel that discusses parallelism in QLisp and a construct called brands. Brands share a lot of philosophical similarity with the way I am seeking to implement support for patterns in Maverick.

If you've never read it, it's quite interesting.

Thursday, September 15, 2005

A technique for applying design patterns to achieve intentional programming goals

This was written in June of 2004. It's the start of a formal look at the stages of design, with a little musing on the characteristics of formal solutions and how they differ from formal requirements. This train of thought started me down the path that led to my current experiments with Maverick.

If you start with an instance network, completely described in a uniform syntax, then GoF Design Patterns can be applied as part of a recursive process of redundancy reduction and trend extension.

Starting from an unoptimized design model, two basic categories of heuristic are typically applied:
  • Reduction of redundancy

  • Projection (using discrete examples to anticipate classes)

The premise of this approach is that for any problem there exists a theoretical model of the problem which is rigorous and discrete (albeit potentially infinite). Representations that encode the problem as rules or non-discrete abstractions are algorithmic transformations of the discrete model, usually for the above-stated purposes of optimization and generalization.

The methodology exploits this premise to seek, rigorously, an optimized design solution to any particular problem: first express the problem as a system of discrete and algorithmic statements, then systematically transform that system by applying a set of heuristics until a defined set of criteria (quality goals) is satisfied.
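
To make the two heuristics above concrete, here is a minimal Python sketch (every name in it is hypothetical, not part of Maverick or any existing tool): given a few discrete example instances described in one uniform syntax, it projects out the attribute structure they share, anticipating a class, and then re-expresses each instance in terms of that shared shape, reducing redundancy.

    # A minimal sketch of the two heuristics, assuming instances are
    # plain attribute dictionaries. All names are hypothetical.

    def project_class(instances):
        """Projection: extract the attribute names common to every instance."""
        shared = set(instances[0])
        for inst in instances[1:]:
            shared &= set(inst)
        return sorted(shared)

    def reduce_redundancy(instances, shape):
        """Redundancy reduction: re-express each instance as its values
        for the shared shape, plus whatever falls outside it."""
        rows = []
        for inst in instances:
            core = {name: inst[name] for name in shape}
            extra = {k: v for k, v in inst.items() if k not in shape}
            rows.append((core, extra))
        return rows

    # Three discrete examples observed in the problem domain:
    observed = [
        {"name": "invoice-1", "total": 120.0, "paid": True},
        {"name": "invoice-2", "total": 80.5, "paid": False},
        {"name": "invoice-3", "total": 42.0, "paid": False, "note": "rush"},
    ]

    shape = project_class(observed)        # -> ['name', 'paid', 'total']
    rows = reduce_redundancy(observed, shape)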

One characterization of the steps involved in designing a solution to a problem is:

1. Create a Requirements Model.
   a. Capture the real-world problem as a set of empirical examples (use cases) and cross-cutting rules (requirements). At this stage, consider only rules that describe the problem (functional requirements); ignore requirements concerned with the manner or style in which the problem must be solved (nonfunctional requirements).
   b. Normalize the use cases and requirements so that they are consistent with one another and use the same vocabulary and granularity of concept.

2. Create a Design Model.
   a. Adopt a high-level design language for expressing a model of the problem, one for which a library of design optimization heuristics exists.
   b. Transform the requirements model produced in step 1 into an equivalent model expressed in the design language chosen in step 2a.
      i. First transform each use case literally and directly from the organic, quasi-natural-language form captured in step 1a into the rigorous language adopted in step 2a.
      ii. Then express each requirement in a rigorous form using the same language.

3. Iterate steps 1-2 until the requirements model, the design model, and the real-world problem are verifiably consistent with one another for the cases considered.

4. Establish a set of acceptance criteria (quality goals) for an acceptable design.

5. Optimize the Design Model.
   a. Establish a set of priorities (design rules) for the application of design heuristics (patterns).
   b. Evaluate the design model against the criteria established in step 4 to determine where the design fails to pass.
   c. Apply design heuristics to the design model to transform it into a more optimized form.
   d. Iterate steps 5b-5c until 5b is satisfied. If the criteria cannot be satisfied within a finite number of iterations, decide whether to return to step 5a, step 4, or step 1.


At this point, the Design Model can be translated into an implementation…
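
The optimization loop of step 5 is mechanical enough to sketch. The fragment below only illustrates the control flow under stated assumptions (criteria as predicates over the model, heuristics as priority-ordered pairs of applicability test and transformation); none of it is real Maverick machinery.

    # Steps 5b-5d as a loop. Everything here is illustrative.

    def optimize(model, criteria, heuristics, max_iterations=100):
        for _ in range(max_iterations):
            failures = [c for c in criteria if not c(model)]
            if not failures:                   # step 5b: all quality goals pass
                return model, []
            for applies, apply in heuristics:  # ordered per step 5a
                if applies(model):
                    model = apply(model)       # step 5c
                    break
            else:
                break  # no heuristic applies; stop early
        # Step 5d: criteria unmet -- the designer decides whether to
        # revisit step 5a, step 4, or step 1.
        return model, [c for c in criteria if not c(model)]

    # Toy usage: the "design" is a duplication count driven to zero.
    design = {"duplicate_blocks": 3}
    criteria = [lambda m: m["duplicate_blocks"] == 0]
    heuristics = [(lambda m: m["duplicate_blocks"] > 0,
                   lambda m: {**m, "duplicate_blocks": m["duplicate_blocks"] - 1})]
    optimized, unmet = optimize(design, criteria, heuristics)
    # optimized == {'duplicate_blocks': 0}, unmet == []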

So I’ve been looking at the formal semantics of programming languages and their overlap with the formal specification of requirements. It’s interesting to note that very little of the work seems to be more recent than the mid-1990s, and yet the principles and notational systems do not appear to be in widespread use today.

Why? What happened?

There are three principal forms of semantic specification:

  • Denotational Semantics

  • Operational Semantics

  • Axiomatic Semantics


I’m still working towards a real understanding, but superficially it appears that axiomatic semantics is the domain of requirements specification. Denotational semantics I’m still somewhat unsure of, but it figures significantly in formal proofs of program translation. Operational semantics appears to correspond to the programming language itself.
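
A toy example may make the distinction clearer. The sketch below is my own illustration, not drawn from any of the literature: it gives a tiny arithmetic language an operational semantics, as a machine that rewrites expressions step by step, and a denotational semantics, as a compositional mapping from syntax into ordinary numbers.

    # A tiny language of nested tuples: ("lit", n), ("add", e1, e2),
    # ("mul", e1, e2). Purely illustrative.

    # Operational: rewrite one step at a time until a value remains.
    def step(expr):
        op, left, right = expr
        if left[0] != "lit":
            return (op, step(left), right)
        if right[0] != "lit":
            return (op, left, step(right))
        result = left[1] + right[1] if op == "add" else left[1] * right[1]
        return ("lit", result)

    def run(expr):
        while expr[0] != "lit":
            expr = step(expr)
        return expr[1]

    # Denotational: map syntax compositionally onto numbers.
    def denote(expr):
        if expr[0] == "lit":
            return expr[1]
        f = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}[expr[0]]
        return f(denote(expr[1]), denote(expr[2]))

    prog = ("add", ("lit", 1), ("mul", ("lit", 2), ("lit", 3)))
    assert run(prog) == denote(prog) == 7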

There has been work done on axiomatic semantic systems and languages (some of which are directly executable). It is unclear to me why constructive modeling techniques (standard programming) are necessary when formalized axiomatic systems exist. Is it simply a matter of ease and accessibility? (Axiomatic systems are admittedly abstruse and inaccessible, though easy to prove correct for a given problem set.)
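
In the same toy spirit, the axiomatic view treats the meaning of a command as the relation between a precondition and a postcondition. The sketch below "executes" a Hoare triple by brute-force checking it over a small finite state space; real axiomatic systems establish this symbolically, so this is only an analogy.

    # Checking {x >= 0} x := x + 1 {x >= 1} by enumeration over a
    # finite state space. Illustrative only.

    def hoare_valid(pre, command, post, states):
        """Every state satisfying pre must satisfy post after command runs."""
        return all(post(command(s)) for s in states if pre(s))

    inc_x = lambda s: {**s, "x": s["x"] + 1}   # the command x := x + 1
    states = [{"x": n} for n in range(-10, 10)]

    assert hoare_valid(lambda s: s["x"] >= 0, inc_x,
                       lambda s: s["x"] >= 1, states)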

This question seems to be at the root of the next major round of thinking.

Tuesday, September 13, 2005

Implementing Composite Types using Aspects

I've been able to implement composite types outside the kernel of Maverick in a very novel way. The facilities provided by the kernel are sufficient to implement even this fundamental capability using the same features any user has access to.

The kernel of Maverick is largely composed of a hierarchical type mechanism with support for extensible attribution, a basic Aspect mechanism, and a base instance implementation that supports both general instances and lvalues.

Even lvalues are arguably an extension to the kernel, since they are essentially implemented as a specialization aspect of instance that adds the setValue() behavior. However, from a user's point of view, lvalues can't be subtypes of instance, since constants and variables must be equivalent for type-comparison purposes. The orthogonality of the implementation is therefore trivially compromised so that lvalues are type-equivalent with instances despite their implementation.
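
Maverick's own source isn't shown here, but a rough Python analogy of the arrangement (all class names are mine, purely for illustration) looks something like this: the lvalue capability is grafted onto a read-only instance as an aspect-style wrapper that adds setValue(), while type comparison deliberately looks through the wrapper so that constants and variables stay type-equivalent.

    # Not Maverick code -- a rough Python analogy. The wrapper plays
    # the role of the specialization aspect that adds setValue().

    class Instance:
        def __init__(self, type_name, value):
            self.type_name = type_name
            self._value = value

        def getValue(self):
            return self._value

    class LvalueAspect:
        """Adds setValue() while presenting the wrapped instance's type."""
        def __init__(self, instance):
            self._instance = instance
            self.type_name = instance.type_name  # same type for comparison

        def getValue(self):
            return self._instance.getValue()

        def setValue(self, value):
            self._instance._value = value

    const = Instance("int", 3)
    var = LvalueAspect(Instance("int", 0))
    var.setValue(7)
    assert const.type_name == var.type_name  # equivalent despite implementation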

Extending this kernel with even fundamental behaviors isn't too hard. Since even type is an instance, we can add both traditional runtime behaviors and compile-time (runtime for type) behaviors using the same basic mechanisms.

In the particular case of composite types, I added a new subtype called struct, whose runtime initializer behavior is overloaded to construct a new instance's value: it examines the struct subtype's new type attribute, properties, to discover the list of members of the structure. The initializer then creates a value for the instance which is a hash of members, allocating a new instance for each member according to the types found in struct->properties.

I also added compile-time/define-time behaviors addMember() and getMember() to the struct type so that we can add members to structs at the time the struct is defined.
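
Again as a Python approximation rather than actual Maverick code (the names below mirror the description but are otherwise invented), the whole arrangement fits in a few lines:

    # A struct type is just another instance: its define-time behaviors
    # record member names and types in its properties, and its
    # overloaded initializer walks those properties to allocate a hash
    # of member instances. Illustrative only.

    class Type:
        def __init__(self, name, initializer):
            self.name = name
            self.properties = {}
            self.initializer = initializer

        def new_instance(self):
            return self.initializer(self)

    def struct_initializer(struct_type):
        # Build the value as a hash of members, one new instance per
        # member, according to the types found in the properties.
        return {member: member_type.new_instance()
                for member, member_type in struct_type.properties.items()}

    def addMember(struct_type, name, member_type):  # define-time behavior
        struct_type.properties[name] = member_type

    def getMember(struct_type, name):               # define-time behavior
        return struct_type.properties[name]

    int_type = Type("int", initializer=lambda t: 0)
    point = Type("point", initializer=struct_initializer)
    addMember(point, "x", int_type)
    addMember(point, "y", int_type)
    assert point.new_instance() == {"x": 0, "y": 0}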

Total code? Less than a hundred and fifty lines in a single user-level file.

Thursday, September 01, 2005

iPods and Markov Processes

Only Peter Norvig would apply Markov Processes to turn an iPod Shuffle into a more managed and deterministic device.

iPods, Markov Processes and the Martin Shuffle.