Text files with different structures and delimiters
Message
To: Dragan Nedeljkovich, Zrenjanin, Serbia
08/12/2001 19:55:17
General information
Forum: Visual FoxPro
Category: Coding, syntax & commands, Miscellaneous
Thread ID: 00591251
Message ID: 00591799
>I've done it, and the only problem was the CRs in memos. You'd need to take a good look at the field separators - they should be tabs, but they could be something else. The actual code is not my property, so I'll just give you the synopsis:
>
>Create a table with three fields: the name from the first line, the name of the field you want to create, and its type/size (make them all character or memo), like
>
>"Street_Address", "cAddress", "c(45)"
>
>and index it on the first field. Read the first line from the import file, and split it by the delimiters. Find each field in this table of yours; if it's not there, invent a dummy field - I used something like Fld023 c(10) if I didn't have a match. Build an array as you go - with your field names - and build a string to use in a Create Cursor command. Once you're done with the field names line, create this cursor
>
>create cursor cuImport (&cStructure)
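
(Not Dragan's actual code - just a rough sketch of that header step, with invented names: FieldMap is the mapping table, cHeader the first line, tabs are assumed as separators, and the table is assumed indexed on UPPER(cImpName).)

* FieldMap: cImpName (name from the first line), cFldName (field to create),
* cFldType (type/size, e.g. "c(45)")
LOCAL laNames[1], lnFields, li, lcStructure
* split the header on tabs by turning them into CRs for aLines()
lnFields = ALINES(laNames, STRTRAN(cHeader, CHR(9), CHR(13)))
lcStructure = ""
SELECT FieldMap
FOR li = 1 TO lnFields
   IF SEEK(UPPER(ALLTRIM(laNames[li])))
      * known name - take the definition from the mapping table
      laNames[li] = ALLTRIM(cFldName)
      lcStructure = lcStructure + laNames[li] + " " + ALLTRIM(cFldType) + ", "
   ELSE
      * no match - invent a dummy field, as described above
      laNames[li] = "Fld" + PADL(li, 3, "0")
      lcStructure = lcStructure + laNames[li] + " c(10), "
   ENDIF
ENDFOR
lcStructure = LEFT(lcStructure, LEN(lcStructure) - 2)   && drop trailing ", "
CREATE CURSOR cuImport (&lcStructure)
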
>Now scan the rest of the lines, one by one. I was using low-level file functions, building a buffer string. Because of the embedded carriage returns, I had to count the field delimiters until I was sure I had the whole line in the buffer, then cut the line into a separate string, leaving the remainder in the buffer for the next record. Then I'd convert this one-record string into an array using aLines() - I had to strtran() the CRs into something else (like $$ pairs and such) so I wouldn't break the memos, and then strtran() the field separators into CRs. Now the value of each field would be in a member of this array. I'd then parse this array:
>
>append blank
>for i=1 to nNumFields
> replace (aStruArray[i]) with aFieldArray[i]
>endfor
>
>and then I'd loop to the next record in the input file. Sometimes I had to strip the quotation marks or apostrophes from the values, or restore the CRs from the $$ marks, etc., but generally that's just massage.
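
(Again only a sketch, simplified to use fGets() on a file handle lnHandle from fOpen() rather than a raw fRead() buffer with a remainder; lnFields and laNames come from the header step above, and "$$" stands in for embedded CRs as described.)

LOCAL lcRecord, laValues[1], lnGot, li
DO WHILE NOT FEOF(lnHandle)
   * glue physical lines until the buffer holds lnFields - 1 separators,
   * i.e. one complete logical record; embedded memo CRs become $$ pairs
   lcRecord = FGETS(lnHandle, 8192)
   DO WHILE OCCURS(CHR(9), lcRecord) < lnFields - 1 AND NOT FEOF(lnHandle)
      lcRecord = lcRecord + "$$" + FGETS(lnHandle, 8192)
   ENDDO
   * swap the field separators for CRs so aLines() splits on them
   lnGot = ALINES(laValues, STRTRAN(lcRecord, CHR(9), CHR(13)))
   SELECT cuImport
   APPEND BLANK
   FOR li = 1 TO MIN(lnGot, lnFields)
      * restore the embedded CRs from the $$ marks
      REPLACE (laNames[li]) WITH STRTRAN(laValues[li], "$$", CHR(13))
   ENDFOR
ENDDO
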
>What you get here is a cursor with all the values as strings. You'd have to run several rounds until you fill this names table with a proper structure to accommodate whatever you can get from this import. A good thing would be to also have more fields in this matching table, namely the names of the fields in your real tables, and maybe the name of a conversion function to perform (like ctod() for dates, proper() for names or some UDF for special cases). The second part of the process would make use of this and scan through your cuImport cursor, moving the data from it to your proper tables.
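
(A sketch of that second pass, assuming the matching table grew two more invented columns - cRealFld for the target field and cConvFunc for the conversion function - and that every mapped field exists in cuImport; MyRealTable stands for the destination table.)

LOCAL luValue
SELECT cuImport
SCAN
   SELECT MyRealTable
   APPEND BLANK
   SELECT FieldMap
   SCAN FOR NOT EMPTY(cRealFld)
      * pull the raw string out of the import cursor by field name
      luValue = EVALUATE("cuImport." + ALLTRIM(cFldName))
      IF NOT EMPTY(cConvFunc)
         * e.g. "ctod" for dates, "proper" for names, or some UDF
         luValue = EVALUATE(ALLTRIM(cConvFunc) + "(luValue)")
      ENDIF
      REPLACE (ALLTRIM(cRealFld)) WITH luValue IN MyRealTable
   ENDSCAN
ENDSCAN
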
>
>Not that I wish to discourage you, but while this sounds simple, there are more speed bumps down the road. You'll discover some repeated values which actually fit some of your lookups, and you'll match them to your lookups... until one day you discover they've changed the set of values on you. Since they owe you nothing, they won't tell you. I was maintaining and speeding up this code for three seasons, and it was never simple. But it's feasible, and if you don't have to do much data massaging, it may be easy.


Hi Dragan,

Sounds like a plan, and a very complicated one, but I've already discussed this problem and found a few things:
1) We pay something for the data, but a very cheap price
2) We receive this data roughly once every 6 months
3) It's unlikely that the data format will change, but it's possible (as happened this time)

So, I'm going to take a simplified route, as I described in my reply to Winn: check the current table structure against what they have in the first line (if they put this structure there), and if it doesn't correspond, invoke the Modify Structure dialog... I may use your idea of an intermediate "conversion" table to make this comparison easier...
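
Roughly what I have in mind (cFirstLine and MyTable are made-up names here; this assumes tab separators and that the field names really are in the first line):

LOCAL laHead[1], laOld[1], lnHead, lnOld, llMatch, li
* field names from the first line of the import file
lnHead = ALINES(laHead, STRTRAN(cFirstLine, CHR(9), CHR(13)))
SELECT MyTable
lnOld = AFIELDS(laOld)            && laOld[li, 1] holds the field name
llMatch = (lnHead = lnOld)
IF llMatch
   FOR li = 1 TO lnOld
      IF NOT (UPPER(ALLTRIM(laHead[li])) == UPPER(laOld[li, 1]))
         llMatch = .F.
         EXIT
      ENDIF
   ENDFOR
ENDIF
IF NOT llMatch
   MODIFY STRUCTURE               && let the operator adjust the table by hand
ENDIF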

Thanks again for the idea.
If it's not broken, fix it until it is.

