If you can avoid the DataSet altogether, you avoid the problem of loading those 7000 records into memory and then parsing through them.
Use a DataReader and read each record as it comes in from the database. Use the appropriate typed DataReader accessors (GetInt32(), GetDateTime(), etc.) and you'll drastically improve performance.
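A minimal sketch of what that looks like, assuming SQL Server and a hypothetical "Orders" table with three columns (adjust the connection string, query, and column ordinals to your schema):

```vbnet
Imports System
Imports System.Data.SqlClient

Public Class OrderScanner

    Public Sub ReadOrders(ByVal tcConnectionString As String)
        Dim loConn As New SqlConnection(tcConnectionString)
        loConn.Open()
        Try
            Dim loCmd As New SqlCommand( _
                "SELECT OrderId, OrderDate, Total FROM Orders", loConn)
            Dim loReader As SqlDataReader = loCmd.ExecuteReader()
            While loReader.Read()
                ' Typed accessors by ordinal: no boxing, no name lookup
                Dim lnId As Integer = loReader.GetInt32(0)

                ' Check for null BEFORE calling a typed getter -
                ' the getters throw on DBNull
                Dim ldDate As Date = Date.MinValue
                If Not loReader.IsDBNull(1) Then
                    ldDate = loReader.GetDateTime(1)
                End If

                Dim lnTotal As Decimal = 0
                If Not loReader.IsDBNull(2) Then
                    lnTotal = loReader.GetDecimal(2)
                End If

                ' ... process the record here, one row at a time,
                ' without ever holding all 7000 in memory ...
            End While
            loReader.Close()
        Finally
            loConn.Close()
        End Try
    End Sub

End Class
```

The null handling moves into the read loop, so there's no second pass over the data at all.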
That may not be possible in your situation, but any time you hold large numbers of records in a DataSet you're asking for resource and performance issues.
OTOH, 7000 records is not really a lot.
More:
* Don't compare the type name as a string - compare the actual Type
objects (loColumn.DataType Is GetType(Date)). Note that your
"System.Integer" case never matches anything: VB's Integer is
System.Int32, so integer columns fall through to Case Else and you
try to stuff "" into them.
* Use ordinal field numbers (or the DataColumn object itself) instead
of string field names to index into the row. String lookups are an
order of magnitude slower on large data sets.
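Here's a sketch of your function with both changes applied: the per-column default is computed once per table by comparing Type objects, and the inner loop indexes rows by ordinal. (oApp.GetEmptyDate() is your own helper, carried over as-is.)

```vbnet
' Adjust a dataset to avoid null values - optimized version
Public Function AdjustDataSetToAvoidNullValue(ByVal toDataSet As DataSet) As DataSet
    Dim ldDate As Date = oApp.GetEmptyDate()
    Dim loTable As DataTable
    Dim loRow As DataRow

    For Each loTable In toDataSet.Tables

        ' Precompute the non-null default for each column, once per
        ' table instead of once per cell
        Dim laDefaults(loTable.Columns.Count - 1) As Object
        Dim i As Integer
        For i = 0 To loTable.Columns.Count - 1
            Dim loType As Type = loTable.Columns(i).DataType
            If loType Is GetType(Date) Then
                laDefaults(i) = ldDate
            ElseIf loType Is GetType(Integer) OrElse _
                   loType Is GetType(Decimal) Then
                laDefaults(i) = 0
            ElseIf loType Is GetType(Boolean) Then
                laDefaults(i) = False
            Else
                laDefaults(i) = ""
            End If
        Next

        For Each loRow In loTable.Rows
            For i = 0 To laDefaults.Length - 1
                ' Ordinal indexing: no string-to-column lookup,
                ' and IsNull(i) avoids boxing the value just to test it
                If loRow.IsNull(i) Then
                    loRow(i) = laDefaults(i)
                End If
            Next
        Next
    Next

    Return toDataSet
End Function
```

The Select Case on the type-name string disappears entirely; all the type decisions happen columns-times per table instead of rows-times-columns times.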
+++ Rick ---
>Is there a way to optimize the speed result on this code:
>
>
> ' Adjust a dataset to avoid null values
> ' expO1 Dataset
> Public Function AdjustDataSetToAvoidNullValue(ByVal toDataSet As DataSet) As DataSet
> Dim ldDate As Date = oApp.GetEmptyDate()
> Dim loColumn As DataColumn
> Dim loRow As DataRow
> Dim loTable As DataTable
>
> For Each loTable In toDataSet.Tables
> For Each loRow In loTable.Rows
> For Each loColumn In loTable.Columns
> If IsDBNull(loRow.Item(loColumn.ColumnName)) Then
>
> Select Case loColumn.DataType.ToString
>
> Case "System.DateTime"
> loRow.Item(loColumn.ColumnName) = ldDate
>
> Case "System.Integer"
> loRow.Item(loColumn.ColumnName) = 0
>
> Case "System.Decimal"
> loRow.Item(loColumn.ColumnName) = 0
>
> Case "System.Boolean"
> loRow.Item(loColumn.ColumnName) = False
>
> Case Else
> loRow.Item(loColumn.ColumnName) = ""
>
> End Select
>
> End If
> Next
> Next
> Next
>
> Return toDataSet
> End Function
>
>
>On a 7000 records dataset, which contains a lot of fields, this process can take up to 4 seconds.