Monthly Archives: December 2014

6 questions to answer

These are questions that need to be resolved, because my vision for Andl depends on them. In each case the question is how to build on the work of TTM and TD to better handle the following (sketched in SQL terms after the list):

  1. Complex aggregation, including things like:
    1. statistical variance
    2. running sums
    3. aggregation across ‘outer joins’.
  2. Deep ‘self joins’, that is, relations that form graphs and trees, including aggregation over them (such as pricing a bill of materials).
  3. Deep nesting, that is, relations that are not in first normal form, with RVAs (relation-valued attributes) that in turn contain RVAs, forming trees and graphs within a single relation.
  4. Relational (joinable) functions.
  5. Relational updates (not so SQL-like).
  6. Quantifiable restriction (equivalents of SQL TOP/LIMIT; for example, list the parts obtainable from at least 2 suppliers).

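To make the list concrete, here is roughly what items 1b, 2 and 6 look like in SQL today. These are sketches of the status quo, not Andl syntax; SP is the familiar suppliers-and-parts table, and PART_STRUCTURE (parent, component, qty) is an invented table used only to illustrate the bill of materials.

```sql
-- 1b. Running sum, via a window function.
SELECT SNO, PNO, QTY,
       SUM(QTY) OVER (PARTITION BY SNO ORDER BY PNO) AS running_qty
FROM   SP;

-- 2. Deep 'self join': explode a bill of materials with a recursive CTE
--    (pricing it would add a join to a part-price column).
WITH RECURSIVE bom (component, qty) AS (
    SELECT component, qty
    FROM   PART_STRUCTURE
    WHERE  parent = 'P1'
  UNION ALL
    SELECT ps.component, b.qty * ps.qty
    FROM   bom b
    JOIN   PART_STRUCTURE ps ON ps.parent = b.component
)
SELECT component, SUM(qty) AS total_qty
FROM   bom
GROUP  BY component;

-- 6. Quantifiable restriction: parts obtainable from at least 2 suppliers.
SELECT PNO
FROM   SP
GROUP  BY PNO
HAVING COUNT(DISTINCT SNO) >= 2;
```
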
I think the groundwork has been laid already, and all I need to do is draw it together.

Filed under Rationale

Out of the tar pit

A really good paper, by Ben Moseley and Peter Marks, and definitely worth a read. See here: http://shaffner.us/cs/papers/tarpit.pdf.

Complexity is the single major difficulty in the successful development of large-scale software systems. Following Brooks we distinguish accidental from essential difficulty, but disagree with his premise that most complexity remaining in contemporary systems is essential. We identify common causes of complexity and discuss general approaches which can be taken to eliminate them where they are accidental in nature. To make things more concrete we then give an outline for a potential complexity-minimizing approach based on functional programming and Codd’s relational model of data.

There is nothing in the paper that should have surprised a relational audience. Indeed they might have sat back and quietly nodded or applauded at the appropriate places. Mind you, it’s pretty basic stuff and lacks higher order operations.

There was quite a bit in the comments about a suitable language to puzzle or even offend a TTM advocate. The type system, the structure, and even the scope and purpose of the language would be hard to reconcile with D. Some of these points are made on p63.

I read a Feeder as a way to obtain data from a non-relational source, which necessarily results in the execution of an INSERT/UPDATE/DELETE operation. An Observer would be a way for external logic to execute as a consequence of a change in relational state (whether that happens by trigger or by polling is unimportant). The result could be as simple as updating a screen display, or something more complex like sending an email or synchronising with another system. These are very MVC-like concepts.
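
To make that reading concrete, here is a minimal sketch in PostgreSQL terms; the paper itself is not tied to SQL, and the table and channel names here are invented for illustration. The Feeder is simply whatever code issues the data-changing statements, and the Observer is modelled as a trigger that reacts to the change in relational state.

```sql
-- Relational state that a Feeder populates via ordinary INSERT/UPDATE/DELETE.
CREATE TABLE orders (
    order_id  integer PRIMARY KEY,
    customer  text,
    total     numeric
);

-- An Observer modelled as an AFTER trigger: external logic reacts to the change
-- in relational state, here by raising a notification the application listens for.
CREATE FUNCTION observe_orders() RETURNS trigger AS $$
BEGIN
    PERFORM pg_notify('orders_changed', TG_OP);
    RETURN NULL;   -- the return value of an AFTER trigger is ignored
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER orders_observer
AFTER INSERT OR UPDATE OR DELETE ON orders
FOR EACH STATEMENT EXECUTE PROCEDURE observe_orders();

-- A Feeder is then just whatever code pushes external data in:
INSERT INTO orders VALUES (1, 'Acme', 99.50);
```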

My main disappointment with the paper is that the (hoped-for) final section is missing or severely truncated. The exposé of the problems is excellent as far as it goes, but the fragments of concrete solution presented are unsatisfying. The paper got some attention here: http://lambda-the-ultimate.org/node/1446. Moseley released some source code, but does not seem to have worked on this much since about 2006. See: https://groups.google.com/forum/?fromgroups#!topic/frp-discuss/BNmBgtqRUFY.

In a nutshell, he captures what I would like to do, but doesn’t help all that much with solving the problem of how to do it.

Filed under Rationale

The true calling of ‘D’

Much of TTM and the related writings deals with what is wrong with SQL and how it could be done better. SQL is an essential part of the communication between applications and databases, usually in conjunction with an ORM of some kind, which rather suggests a similar role for the language D. At least, that is the way it has seemed to me.

My question is: would it be better to think of D not so much as a replacement for SQL as the language in which to code an application data model?

Using a slightly modified version of my 4 layers, and just thinking about a modest web app:

  1. UI access: coded in HTML, CSS and JS.
  2. Glue code: coded in a GP language (Java, C#, Ruby, etc.).
  3. Data model: coded in a GP language.
  4. DBMS access: coded in a GP language and SQL.

By data model here I mean the totality of the state of the business data that models the application, both transient and persistent. The idea would be to code layer 3 entirely in a suitable D. It would draw together data from a variety of sources, and would be free to use SQL and a DBMS for persisting or retrieving data.
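
Purely as an illustration of what “draw together data from a variety of sources” means (the base tables here are invented), the kind of consolidated, derived state such a data model holds can be pictured today as a SQL view; the proposal is that a suitable D would express this, along with the transient state around it, directly.

```sql
-- Illustration only: consolidated state a layer-3 data model might hold,
-- pictured as a SQL view over invented base tables.
CREATE VIEW customer_summary AS
SELECT c.customer_id,
       c.name,
       COUNT(o.order_id)          AS order_count,
       COALESCE(SUM(o.total), 0)  AS lifetime_value
FROM   customers c
LEFT JOIN orders o ON o.customer_id = c.customer_id
GROUP  BY c.customer_id, c.name;
```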

And that leaves me pondering two questions.

  1. What specific D features are required to fully implement a data model? Scoped constructs like functions or modules are vital, I think.
  2. What should the API between the application and the data model look like? POCOs rather than relvars, I think.

All of a sudden this sounds like a bigger project.

Filed under Rationale, TTM