From: Bill von Valtier
To: Jamie Osborn
Forum: General VFP Topics
Date: Monday, September 30, 2002 5:53:17 AM
> > Hello friends,
> > Here is an interesting problem. I need to append (into a new table) data from an ASCII text file in which many records are broken into two segments because they are longer than 256 characters, apparently the maximum length of a DOS text file line. The base text file has 9600 records of 38 fields each, of which about 600 are > 256 characters long and are thus continued as a separate next record. This glitch does "very bad things" to a simple APPEND FROM.
> > So, the question is: is there some direct way to do this without going the hard way, i.e. 1) test each text record for whether it's broken or not; 2) if broken, reconstruct it by adding the next record to it; 3) then, since the reconstructed string cannot be APPENDed (or can it?), parse it field by field into MEMVARs and INSERT them into the DBF file?
> > There's gotta be a better way! Any ideas?
> > Thanks.
> > BillvV
> > Rochester, Michigan
> You'll have to go line by line, I'm afraid. Nothing wrong with some good old-fashioned string parsing!
Thanks for the input, Jamie. It confirms what I thought.
As it turns out, I went ahead and bit the bullet to parse the file line by line, and it wasn't nearly as horrendous as I first thought. It required only 22 additional lines of code. Already having the 38-field structure of the new file made up, the rest was pretty easy: detect the split records, then use the AFIELDS() function and read each MEMVAR in a fairly simple loop. Not as fast as APPEND, but it did the entire 9200 records x 38 fields in about 5 seconds. Bless VFP. My next book: "Old-fashioned string parsing can be fun."
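In case anyone finds this thread later and wants the flavor of it, here is a stripped-down sketch of the loop. This is not my production code: the file name IMPORT.TXT, the table name NEWDATA, the tab delimiter, and the assumption that all 38 fields are character type are invented for illustration (my real code pulls the actual field list from AFIELDS()). A line carrying fewer than 37 tab separators is treated as a record that was broken at the 256-character limit, with its tail sitting on the next line.

```foxpro
* Sketch of the split-record repair loop (hypothetical names throughout).
* Assumes: tab-delimited fields, all 38 fields character type, and that
* a broken record is split into exactly two lines.
LOCAL lnLines, lnI, lnJ, lnPos, lcLine
LOCAL laLines[1], laFields[38]

lnLines = ALINES(laLines, FILETOSTR("IMPORT.TXT"))
USE NEWDATA IN 0
SELECT NEWDATA

lnI = 1
DO WHILE lnI <= lnLines
    lcLine = laLines[lnI]
    * A complete record holds 37 tab separators; fewer means the
    * record was broken, so glue the next line back onto it.
    IF OCCURS(CHR(9), lcLine) < 37 AND lnI < lnLines
        lcLine = lcLine + laLines[lnI + 1]
        lnI = lnI + 1
    ENDIF
    * Chop the rebuilt line into one array element per field.
    FOR lnJ = 1 TO 38
        lnPos = AT(CHR(9), lcLine)
        IF lnPos > 0
            laFields[lnJ] = LEFT(lcLine, lnPos - 1)
            lcLine = SUBSTR(lcLine, lnPos + 1)
        ELSE
            laFields[lnJ] = lcLine
            lcLine = ""
        ENDIF
    ENDFOR
    INSERT INTO NEWDATA FROM ARRAY laFields
    lnI = lnI + 1
ENDDO
USE IN NEWDATA
```

Note the sketch would misfire on a record that happens to split inside its last field (the first segment then already holds all 37 tabs); checking the length of the glued candidate as well would tighten that up.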
Thread:
Text file import problem
  Posted by Bill von Valtier @ 9/29/2002 6:49:07 AM
RE: Text file import problem
  Posted by Jamie Osborn @ 9/30/2002 1:46:44 AM
RE: Text file import problem
  Posted by Bill von Valtier @ 9/30/2002 5:53:17 AM
RE: Text file import problem
  Posted by Jamie Osborn @ 9/30/2002 6:11:41 AM