Tags: pattern, sql · Minor

How much could sequential scan gain by GPU?

Submitted by: @import:stackexchange-dba

Problem

I am considering hiring someone to implement GPU processing in Postgres to speed up sequential scans on a database located on a ramdisk. How much do you believe I could gain using this approach? I could buy any graphics card. I know that this question is really hard, but I know very little about GPUs, so even a very rough guess would help me a lot.

EDIT: Even though my interest is specifically in Postgres, I could ask this question in a different way: "How large is the potential performance increase from using a GPU for sequential scans on an in-memory database?" If the answer is that it has the potential for 10-50 times the performance, then I can start investigating more specifically whether this could also be realized in Postgres.
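One rough way to bound the answer yourself: an in-memory sequential scan is limited by memory bandwidth, and a GPU can only start working after the table has crossed the PCIe bus. The sketch below does that back-of-envelope arithmetic; every bandwidth figure is an illustrative assumption, not a measurement of any real hardware.

```python
# Back-of-envelope bound on GPU speedup for an in-memory sequential scan.
# All sizes and bandwidths below are illustrative assumptions.

table_gb = 10.0        # size of the table on the ramdisk (assumed)
ram_scan_gbps = 10.0   # RAM read bandwidth a CPU scan can sustain (assumed)
pcie_gbps = 8.0        # host-to-GPU PCIe transfer bandwidth (assumed)
gpu_scan_gbps = 100.0  # GPU memory bandwidth once data is on the card (assumed)

# CPU path: read the table once from RAM.
cpu_time = table_gb / ram_scan_gbps

# GPU path: ship the table over PCIe, then scan it in GPU memory.
gpu_time = table_gb / pcie_gbps + table_gb / gpu_scan_gbps

speedup = cpu_time / gpu_time
print(f"CPU scan: {cpu_time:.2f}s  GPU scan: {gpu_time:.2f}s  "
      f"speedup: {speedup:.2f}x")
```

With these particular assumptions the GPU path comes out slower (the PCIe transfer alone costs more than the whole CPU scan), which suggests that a 10-50x gain would require the data to already reside in GPU memory, or a scan dominated by expensive per-row computation rather than raw data movement.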

Solution

You'd be better off asking this question on the pgsql-hackers mailing list; there has been some debate about this subject in the past, and the PostgreSQL developers can tell you much more about it.

http://archives.postgresql.org/pgsql-hackers/

To get good performance using a GPU with PostgreSQL, you need some form of intra-query parallel processing in PostgreSQL. This is something PostgreSQL doesn't have at the moment (neither for CPUs nor for GPUs), but Greenplum has already implemented it for CPUs in their PostgreSQL-based product. Parallel processing is on the ToDo list, but it looks like nobody is working on it at the moment. Ask the PostgreSQL hackers about the current status.
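To make "parallel processing" concrete: the missing piece is splitting one scan's work into chunks handled by separate workers and merging the partial results. A minimal sketch of that idea, with a hypothetical in-memory table and predicate (not PostgreSQL code; threads keep the sketch portable, where a real engine would use worker processes or GPU kernels):

```python
from concurrent.futures import ThreadPoolExecutor

def scan_chunk(chunk):
    """Scan one chunk of the 'table', keeping rows matching the predicate."""
    return [row for row in chunk if row % 7 == 0]  # hypothetical predicate

def parallel_seq_scan(table, workers=4):
    """Split the table into roughly equal chunks, scan each in its own
    worker, then concatenate the partial results (row order is preserved)."""
    size = (len(table) + workers - 1) // workers
    chunks = [table[i:i + size] for i in range(0, len(table), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(scan_chunk, chunks)
    return [row for part in parts for row in part]

table = list(range(1_000_000))   # stand-in for an in-memory table
matches = parallel_seq_scan(table)
print(len(matches))              # 142858
```

The split/scan/merge shape is the same whether the workers are CPU cores or a GPU; the open question the answer raises is that PostgreSQL's executor has no such fan-out stage for a single scan.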

Context

StackExchange Database Administrators Q#2786, answer score: 3

Revisions (0)

No revisions yet.