HiveBrain v1.2.0

What piece am I missing to turn this idea into a programming language?

Submitted by: @import:stackexchange-cs

Problem

I've been doing some reading (I'll name drop along the way) and have selected a few scattered ideas that I think could be cobbled together into a nifty esoteric programming language. But I'm having some difficulty assembling the parts.

Kleene's Theorem states: Any Regular Set can be recognized by some Finite-State Machine (Minsky 4.3).

Minsky's Theorem 3.5: Every Finite-State machine is equivalent to, and can be "simulated by", some neural net.

"There is a natural way to represent any forest as a binary tree." (Knuth, v1, 333).

And according to Bentley (Programming Pearls, p.126) a binary tree can be encoded as a flat array.

So I'm imagining an array of bit-fields (say 4 bits so it can easily be worked with in hexadecimal). Each field indicates a type of automaton, and the positions of the array encode (via an intermediary binary tree representation) a forest which approximates (? missing piece ?) the power of a graph.

I'm somewhat bewildered by the possibilities of automaton sets to try, and of course the fun Universal Automata require three inputs (I worked up an algorithm inspired by Bentley to encode a ternary tree implicitly in a flat array, but it feels like the wrong direction). So I'd appreciate any side-bar guidance on that. Current best idea: the usual set (AND, OR, XOR, NOT, NAND, NOR), with the remaining bits used for threshold weights on the inputs.

So the big piece I'm missing is a formalism for applying one of these nibble-strings to a datum. Any ideas or related research I should look into?

Edit: My theoretical support suggests that the type of computations will probably be limited to RL acceptors (and maybe generators, but I haven't thought that through).

So, I tried to find an example to flesh this out. The C int isdigit(int c) function performs a logical computation on (in effect) a bit-string. Assuming ASCII, the valid digits are 0x30 through 0x39, so bit 7 must be off, bit 6 must be off, bit 5 must be on, bit 4 must be on, and the low nibble must be at most 9.

Solution

This sounds more like the makings of a computational model than a programming language as such, perhaps in the same way that quantum computation can form the basis of a programming language such as the quantum lambda calculus.

Ask yourself:

  • What kinds of computation are you trying to perform?

  • How can these computations be composed?

  • How can the results of these computations be stored? How can they be used in subsequent computations?

  • How can I represent this syntactically?

  • Does my chosen syntax have a clear denotational meaning? Or a clear operational meaning?

  • Do my syntactic constructs compose nicely? Are they sufficiently orthogonal and free from arbitrary constraints?

  • Can I modularise computations in my language?

  • What forms of abstraction does my language offer?

There are a number of places you could use as a starting point:

  • The imperative language WHILE --- a simple language with variables, assignment, sequencing, if statements, and while statements. Maybe later add procedures.

  • The lambda calculus.

  • Combinatory logic. Programs consist of combinators which are plugged together. No variables.

  • Something like Prolog or Datalog.
Or you could be completely wild and ignore these and see how far you get (which could result in something interesting).

Context

StackExchange Computer Science Q#4618, answer score: 12
