Scalability of running commands from user input
Problem
Here is some code I have that has been extracted and shrunk down from a project of mine.
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <stdint.h>
#define streq(x, y) (strcmp((x), (y)) == 0)
#define ARRAY_SIZE(x) (sizeof(x)/sizeof(x[0]))
typedef struct
{
    const char *cmd;
    void *(*fn)(void);
} __attribute__((__packed__)) Command;

void *getTime(void)
{
    return ((void *)((uintptr_t)time(NULL)));
}

void *getDay(void)
{
    time_t t = time(NULL);
    struct tm tm = *localtime(&t);
    return ((void *)((uintptr_t)(tm.tm_wday)));
}

static Command commands[] =
{
    {"time", getTime},
    {"day", getDay},
};

int main(int argc, char *argv[])
{
    for (int i = 0; i < (int)ARRAY_SIZE(commands); i++)
    {
        const Command *p = &commands[i];
        if (streq(argv[1], p->cmd)) printf("%d\n", (int)(uintptr_t)p->fn());
    }
}
When run with some given input, it runs the method associated with that input. Since this is a scaled-down version of my project, it can only run two commands.
$ ./test-command day
4
Right now, my implementation runs with O(n) time complexity. This would be unacceptable with the hundreds of thousands of commands my project could scale to. I am looking for reviews on scalability and optimization, and on how well my Command structure and function-pointer design would scale.
Solution
As requested.
An unstructured array can only be searched in linear time. To get better asymptotics you need a more search-friendly structure.

The simplest solution is to presort the Command array and apply a binary search to find the command. A definite advantage here is the possibility of using the standard library (qsort to sort, bsearch to look up). You may even have the array sorted at build time.

For hundreds of thousands of commands the binary search would complete in about 20 comparisons. If that is still unacceptable, and your commands' names let you come up with a perfect hash function, hashing is the way to go. Beware that the hash function must really be perfect - that is, almost no collisions and almost no misses. Otherwise, the constant factor would defeat all the advantages of O(1).
Update: assuming a unixish environment, the best way to sort at build time is to have a commands.c file as

static Command commands[] = {
#include "command-list.c"
};

with a command-list.txt text file

"time", getTime
"day", getDay

and a Makefile rule saying something like (may require some backslashes)

%.c: %.txt
	sort $< | sed -e 's/.*/{&},/' > $@
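The effect of that rule can be checked by hand (the printf line merely simulates the command-list.txt file from the answer):

```shell
# Simulate the unsorted source file.
printf '%s\n' '"time", getTime' '"day", getDay' > command-list.txt

# Same pipeline as the Makefile recipe: sort, then wrap each line in {...},
sort command-list.txt | sed -e 's/.*/{&},/' > command-list.c

cat command-list.c
# "day" sorts before "time", so the generated file reads:
# {"day", getDay},
# {"time", getTime},
```

The generated command-list.c is then textually included inside the initializer of commands[], giving an array that is already in bsearch-compatible order.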
Context
StackExchange Code Review Q#54112, answer score: 4