Is it possible to make a language that can build upon itself perfectly?
Problem
First of all, note that I'll have to explain my thoughts in layman's terms.
There are so many high-level programming languages out there that compete with each other. This means we have to build the same functionality over and over again in each language.
Therefore, I wonder the following:
Why isn't there one language that has come to dominate because it's the most efficient and the most robust, being built from the bottom up perfectly?
Of course, 'perfectly' isn't well defined, so let me try to explain what I mean by this.
As far as I'm aware, all operations on computers boil down to working with 1's and 0's. This makes me think that there's a better way to do things.
For example, take ASCII. Why is ASCII the way it is? It's arbitrary that M is 01001101. This arbitrariness seems like the root of the problem: if M's representation were determined by how and where M's manifested themselves in our data, then the processing language would be a common language, instead of ASCII.
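That arbitrariness is easy to inspect directly; nothing about the data forces these particular bit patterns (a small Python illustration):

```python
# ASCII assigns each character a fixed 7-bit code; the assignment reflects
# the standard's history (e.g. upper/lower case differ by a single bit),
# not anything about the data being encoded.
for ch in "Mm":
    print(ch, ord(ch), format(ord(ch), "08b"))
# M 77 01001101
# m 109 01101101
```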
First of all, am I way off base on this intuition? Secondly, let me provide an idea of what I think could at least partially solve this issue.
Imagine I had a functional language that took bits, transformed them in every way possible, and returned bits. I can only think of what this might look like under the constraint that the bits remain the same length. For example, its functionality (in higher-level pseudocode) would look something like this:
```
proc get_transform_list(bit_list, desired_bit_list):
    transform_list = []
    current_bit_list = bit_list
    for i in range(2^len(bit_list)):
        affected_indices = get_list_of_one_indices_from_number(i) // 'on' indices in the binary representation of i
        wrong_indices = get_list_of_bad_indices(current_bit_list, desired_bit_list) // indices where current_bit_list doesn't match desired_bit_list
        if {any wrong_indices match any affected_indices}:
            transform_list.append(1)
        else:
            transform_list.append(0)
    return transform_list
```
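One concrete reading of the idea, under the assumption that a "transform" means flipping a fixed set of bit positions: the transform taking one bit list to another is simply their elementwise XOR. A runnable Python sketch (the helper names are mine, not the original post's):

```python
def get_bad_indices(current, desired):
    """Indices where two equal-length bit lists disagree."""
    return [i for i, (c, d) in enumerate(zip(current, desired)) if c != d]

def get_transform_mask(bit_list, desired_bit_list):
    """The flip mask that maps bit_list onto desired_bit_list.

    Flipping exactly the mismatched positions is the unique
    length-preserving transform of this kind, i.e. the elementwise
    XOR of the two lists.
    """
    bad = set(get_bad_indices(bit_list, desired_bit_list))
    return [1 if i in bad else 0 for i in range(len(bit_list))]

def apply_mask(bit_list, mask):
    """Flip the positions marked 1 in the mask."""
    return [b ^ m for b, m in zip(bit_list, mask)]
```

Applying the mask returned by `get_transform_mask` to the original list reproduces the desired list, so the exhaustive loop over all `2^n` masks in the pseudocode is not actually needed for this narrow version of the problem.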
Solution
Interesting that you reflect on this issue. It is very similar to the issues I was reflecting on when I started my PhD research back in 1976.
Back then, Extensible Languages were very much in vogue. The thesis was that there must be some core of semantic and syntactic elements that could serve as the basis for all other languages. If we constructed the one Master Language, it could then be extended to become anything we wanted. All that was needed was some understanding of the atomic units of semantics: start from a language and work backwards to the atomic, indivisible components that could not be further subdivided. This would be the basis of the one universal language, which could both express all languages and replace them.
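The flavour of that thesis can be seen in Church encodings, a standard construction in which even booleans and branching are built from a single atomic unit, the function (sketched here in Python; this is an illustration of the idea, not the notation any particular extensible language used):

```python
# Church encodings: booleans and control flow built from nothing but
# single-argument functions. TRUE selects its first argument, FALSE its
# second; every other operation is defined in terms of that choice.
TRUE  = lambda a: lambda b: a
FALSE = lambda a: lambda b: b
IF    = lambda c: lambda a: lambda b: c(a)(b)

AND = lambda p: lambda q: p(q)(p)   # if p then q else p
NOT = lambda p: lambda a: lambda b: p(b)(a)

# Convert an encoded boolean back to a native bool for inspection.
to_bool = lambda p: p(True)(False)
```

The point is not that anyone should program this way, but that a very small set of "atomic" constructs can, in principle, express the rest.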
Work on ultimate semantics also led to the search for the one machine instruction that could replace all others: the ultimate RISC machine.
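That line of work survives today as the one-instruction set computer (OISC). A minimal sketch of an interpreter for subleq ("subtract and branch if less than or equal to zero"), one well-known such instruction, in Python; the memory layout in the demo program is my own illustrative choice:

```python
def run_subleq(mem, pc=0):
    """Interpret subleq in place: mem[b] -= mem[a]; jump to c if the
    result is <= 0, otherwise fall through. A negative pc halts."""
    while pc >= 0:
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Demo: add mem[9] into mem[10], using a scratch cell Z at mem[11].
# Each instruction is an (a, b, c) triple; a jump target of -1 halts.
prog = [
    9, 11, 3,    # Z -= x        (Z becomes -x, always <= 0, fall through)
    11, 10, 6,   # y -= Z        (i.e. y += x)
    11, 11, -1,  # Z -= Z, halt  (clear the scratch cell and stop)
    7, 5, 0,     # data: x = 7, y = 5, Z = 0
]
run_subleq(prog)
print(prog[10])  # 12
```

With enough such triples, subleq alone is Turing-complete, which is exactly the sense in which a single instruction "could replace all others".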
As we don't yet have an answer, some are still continuing to reflect.
Context
StackExchange Computer Science Q#90229, answer score: 8
Revisions (0)
No revisions yet.