Importing delimited data cluttered with commas

I have a log file that has 155 columns with about 13000 rows. Typically no problem.

Here is the twist. The data is tab delimited, but numeric values greater than 1000 have a comma in them, e.g. 1312 is written as 1,312. How do I bring this data in and get rid of the damn commas?
What I have tried thus far:
1. Using only tab delimiters in the Tweaks dialog along with "Treat all columns as numeric". Does not work; it only gives me the data before the comma.

2. Loading the file as a notebook and doing a search-and-replace on the commas, then re-saving the data and opening it again. This beachballs Igor and I must force quit.

Any other hints?
My approach to such problems is to write a "cleaned up" version of the file and then load the cleaned up version.

See http://www.igorexchange.com/node/856 for an example of writing a cleaned up version. I think you could use it if you changed this:
    if (ch<0x80 && (ch>=0x20 || ch==0x0D))

to this:
    if (ch<0x80 && (ch>=0x20 || ch==0x0D) && ch!=0x2C)  // Accept CR, reject comma


This assumes that your current file uses CR or CRLF terminators. If it uses LF:
    if (ch<0x80 && (ch>=0x20 || ch==0x0A) && ch!=0x2C)  // Accept LF, reject comma


Then load the cleaned up version.

I typically write a function that creates the cleaned up version, loads it using LoadWave, and deletes it using DeleteFile.
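A minimal sketch of what such a function might look like. The function name, the temporary file name, and the LoadWave flags here are my assumptions, not the code from the linked example; adjust them to your own path and loading options:

    Function LoadCommaCleanedFile(pathName, fileName)
        String pathName    // Name of an Igor symbolic path, e.g. "home"
        String fileName    // Name of the tab-delimited file to clean and load

        // Read the entire file into a string
        Variable refNum
        Open/R/P=$pathName refNum as fileName
        FStatus refNum
        String raw = PadString("", V_logEOF, 0x20)
        FBinRead refNum, raw
        Close refNum

        // Copy accepted characters, rejecting commas (0x2C).
        // Accepts printable ASCII plus tab (0x09) and CR (0x0D);
        // add ch==0x0A here if your file uses LF terminators.
        String cleaned = ""
        Variable i, ch
        for (i=0; i<strlen(raw); i+=1)
            ch = char2num(raw[i])
            if (ch<0x80 && (ch>=0x20 || ch==0x09 || ch==0x0D) && ch!=0x2C)
                cleaned += raw[i]
            endif
        endfor

        // Write the cleaned version, load it, then delete it
        String tempName = "CleanedTemp.txt"    // Assumed temp file name
        Open/P=$pathName refNum as tempName
        FBinWrite refNum, cleaned
        Close refNum
        LoadWave/J/D/K=1/P=$pathName tempName  // Delimited text, numeric, double precision
        DeleteFile/P=$pathName tempName
    End

Note that building the output string character by character is slow for a 155-column by 13000-row file; for large files it is faster to filter in place or in larger chunks, but the simple loop keeps the logic clear.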
Hi Howard,

Thank you for the fast reply. I tried your solution and it almost works. It seems to strip out the tab delimiters.

How should the test in the if statement be modified to keep the tabs?

Andy
[quote=hegedus]I tried your solution and it almost works. It seems to strip out the tab delimiters.
How should the test in the if statement be modified to keep the tabs?
Andy[/quote]

Oops.

Try this:
    if (ch<0x80 && (ch>=0x20 || ch==0x09 || ch==0x0D) && ch!=0x2C)  // Accept tab and CR, reject comma