Insert multiple rows into a SQL table
Problem
I've written the following C# code to insert rows (that a user enters from a web application) into a SQL database. How does this code look? Is there a better, more efficient way to accomplish the task?
As you can see, I am looping to add db parameters to hold the value for each row. I think that there may be a better way to do this.
public void UpdateDeviceStatus(string[] deviceId, byte[] status, string userId, string[] remarks, DateTime dateTurnin)
{
    if (deviceId.Length != status.Length || status.Length != remarks.Length)
        throw new ArgumentException("Invalid arguments passed to UpdateDeviceStatus: deviceId, status, and remarks must contain the same number of entries.");
    if (deviceId.Length == 0)
        throw new ArgumentException("UpdateDeviceStatus expects to update status for at least one deviceId, but an empty array was passed in.");

    // Build the SQL statement: one parameterized row per device.
    StringBuilder sbSql = new StringBuilder();
    sbSql.Append(@"INSERT INTO AT_Event_History(deviceId,parentCode,statusCode,remarks,userId,whenEntered) VALUES");
    for (int i = 0; i < deviceId.Length; i++)
    {
        if (i > 0)
            sbSql.Append(",");
        // NOTE: the expression supplying parentCode was lost when this post was
        // extracted; a lookup subquery is assumed here.
        sbSql.AppendFormat(" (@deviceId{0}, (SELECT parentCode FROM AT_Device WHERE deviceId = @deviceId{0}), @statusCode{0}, @remarks{0}, @userId, @whenEntered)", i);
    }
    string sql = sbSql.ToString();

    Database db = DatabaseFactory.CreateDatabase("AssetTrackConnection");
    DbCommand command = db.GetSqlStringCommand(sql);
    command.CommandType = CommandType.Text;

    // Add the parameters shared by every row.
    db.AddInParameter(command, "@userId", DbType.AnsiString, userId);
    db.AddInParameter(command, "@whenEntered", DbType.Date, dateTurnin);
    // Add the per-row parameters.
    for (int j = 0; j < deviceId.Length; j++)
    {
        db.AddInParameter(command, "@deviceId" + j.ToString(), DbType.Guid, new Guid(deviceId[j]));
        db.AddInParameter(command, "@statusCode" + j.ToString(), DbType.Byte, status[j]);
        db.AddInParameter(command, "@remarks" + j.ToString(), DbType.AnsiString, remarks[j]);
    }

    // Execute the statement.
    db.ExecuteNonQuery(command);
}
Solution
You are building a dynamic INSERT statement with parameterized values. This works, and there is nothing wrong with the method; it may even be the best choice for your circumstances. It performs well while the table is small. I maintain a rather large database that grows monotonically: we keep adding rows and never remove any. Once a table grows beyond a certain point (specific to its design), inserts like this get very slow. At that point, you will want to consider the SqlBulkCopy class: you load the values into a DataTable and then perform a bulk insert. This is much faster because it uses a SQL Server-specific bulk-load mechanism. The SqlBulkCopy documentation has sample code.
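As a rough sketch of the SqlBulkCopy approach described above: the rows are staged in a DataTable whose columns mirror the INSERT in the question, then written to the server in one bulk operation. The method name, connection-string handling, and explicit column mappings here are illustrative assumptions; parentCode is omitted because the question does not show where its value comes from.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public static class EventHistoryBulkLoader
{
    // Illustrative sketch, not the author's code: bulk-load event rows
    // into AT_Event_History using SqlBulkCopy.
    public static void BulkInsert(string connectionString,
        string[] deviceId, byte[] status, string[] remarks,
        string userId, DateTime whenEntered)
    {
        // Stage the rows in a DataTable whose columns match the target table.
        var table = new DataTable();
        table.Columns.Add("deviceId", typeof(Guid));
        table.Columns.Add("statusCode", typeof(byte));
        table.Columns.Add("remarks", typeof(string));
        table.Columns.Add("userId", typeof(string));
        table.Columns.Add("whenEntered", typeof(DateTime));

        for (int i = 0; i < deviceId.Length; i++)
            table.Rows.Add(new Guid(deviceId[i]), status[i], remarks[i], userId, whenEntered);

        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "AT_Event_History";
            // Map columns by name so the DataTable's column order does not matter.
            foreach (DataColumn col in table.Columns)
                bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName);
            bulk.WriteToServer(table);
        }
    }
}
```

Because SqlBulkCopy bypasses per-row statement parsing, its advantage grows with the number of rows; for a handful of rows the parameterized INSERT in the question is perfectly adequate.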
Context
StackExchange Code Review Q#3184, answer score: 7