HiveBrain v1.2.0

Are algorithms (and efficiency in general) getting less important?

Submitted by: @import:stackexchange-cs

Problem

Since buying computation power is much more affordable than in the past, is knowledge of algorithms and efficiency becoming less important? It's clear that you would want to avoid an infinite loop, so not everything goes. But if you have better hardware, could you get away with somewhat worse software?

Solution

On the contrary. At the same time that hardware is getting cheaper, several other developments are taking place.

First, the amount of data to be processed is growing exponentially. This has led to the study of quasilinear time algorithms, and to the area of big data. Think, for example, of search engines: they have to handle large volumes of queries, process large amounts of data, and do it quickly. Algorithms are more important than ever.
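To make the quadratic-versus-quasilinear gap concrete, here is a small sketch (the function names are mine, not from the original answer): detecting a duplicate by comparing every pair is O(n²), while sorting first brings it down to O(n log n). At web scale, that difference decides whether the computation finishes at all.

```python
import random

def has_duplicate_naive(items):
    # Compare every pair: O(n^2). Fine for small n, hopeless at web scale.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_sorted(items):
    # Sort, then scan adjacent neighbours: O(n log n), i.e. quasilinear.
    s = sorted(items)
    return any(a == b for a, b in zip(s, s[1:]))

# Both strategies agree; only their running times diverge as n grows.
data = [random.randrange(10**6) for _ in range(1000)]
assert has_duplicate_naive(data) == has_duplicate_sorted(data)
```

For a billion items, the naive version performs on the order of 10¹⁸ comparisons, the sorted one roughly 3 × 10¹⁰ — cheap hardware does not close a gap like that.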

Second, the area of machine learning is growing rapidly, and it is full of algorithms (albeit of a different kind than what you learn in your BA). The area is thriving, and every so often a truly new algorithm is invented that improves performance significantly.
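As a taste of that "different kind" of algorithm, here is a toy sketch of gradient descent fitting a line to data — an iterative, numerical optimization rather than the discrete algorithms of a typical undergraduate course. This is an illustrative sketch, not any particular library's implementation.

```python
def fit_line(xs, ys, lr=0.01, steps=2000):
    """Fit y ~ w*x + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of (1/n) * sum((w*x + b - y)^2) w.r.t. w and b.
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w   # step downhill in each parameter
        b -= lr * grad_b
    return w, b

# Data lying exactly on y = 2x + 1; the fit should recover w=2, b=1.
w, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

The same update rule, scaled up to millions of parameters, is the workhorse behind training modern machine-learning models.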

Third, distributed algorithms have become more important, since we are hitting a roadblock in increasing CPU processing speed. Nowadays computing power is being increased by parallelizing, and that involves dedicated algorithms.
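The core pattern behind such parallel algorithms is scatter/gather: split the input into chunks, process the chunks independently, then combine the partial results. A minimal sketch, using worker threads as stand-ins for what would be separate machines in a real distributed system:

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(values, workers=4):
    """Sum a list by splitting it into chunks processed concurrently."""
    chunk = (len(values) + workers - 1) // workers
    parts = [values[i:i + chunk] for i in range(0, len(values), chunk)]
    # Scatter: each chunk is handled by its own worker. In a real
    # distributed setting these would be different machines.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(sum, parts))
    # Gather (the "reduce" step): combine the partial results.
    return sum(partials)

assert parallel_sum(list(range(1001))) == 500500
```

Designing the split and the combine step so that they are correct and balanced is exactly where dedicated distributed algorithms come in.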

Fourth, to counterbalance the increasing power of CPUs, modern programming paradigms employ virtual machine methods to combat security loopholes. That slows these programs down by an appreciable factor. Adding to the conundrum, your operating system is investing more CPU time on bells and whistles, leaving less CPU time for your actual programs, which could include CPU-intensive algorithms such as video compression and decompression. So while hardware is faster, it's not used as efficiently.

Summarizing, efficient algorithms are necessary to handle large amounts of data; new kinds of algorithms are popping up in the area of artificial intelligence; distributed algorithms are coming into focus; and CPU power is harnessed less efficiently for various reasons (but mainly, because computers are getting more powerful). Algorithms are not dead yet.

Context

StackExchange Computer Science Q#15017, answer score: 36
