>>>>I have to import a discogs 21GB xml data file.
>>>>
>>>>Someone wrote a library that uses the API to read files larger than 2 GB.
>>>
>>>In any case, use a streaming parser, not a DOM-based one, for amounts that huge. In your shoes I'd look at a Java/SAX-based solution, or split the large file into more digestible chunks of a few hundred MB with a scripting file API, if the XML can be easily partitioned.
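To illustrate the streaming approach (in Python rather than VFP, since it has a streaming XML parser in the standard library): `iterparse` walks the file incrementally and lets you discard each record once it's processed, so memory stays flat no matter how big the file is. The file name and the `release` tag are assumptions about the Discogs dump layout, not something from this thread.

```python
import xml.etree.ElementTree as ET

def stream_records(path, tag):
    """Yield each <tag> element from a huge XML file, one at a time.

    Memory use stays roughly constant: after every yielded record
    we clear the root, dropping elements already handed out.
    """
    context = ET.iterparse(path, events=("start", "end"))
    _, root = next(context)                 # grab the document root first
    for event, elem in context:
        if event == "end" and elem.tag == tag:
            yield elem                      # caller processes the record here
            root.clear()                    # then free everything parsed so far

# Usage sketch (assumed file and tag names):
# for release in stream_records("discogs_releases.xml", "release"):
#     print(release.get("id"))
```

The same shape applies to a SAX handler in Java: react to start/end events, keep only the current record in memory.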
>>
>>I am writing in VFP,
>>using vfp2c32.fll to read the file.
>
>Hi Fabio,
>
>You can use the Windows FileSystemObject, which does not have VFP's 2 GB limitation:
>
>
>FSO = CreateObject("Scripting.FileSystemObject")
>SET MEMOWIDTH TO 180
>oTS = FSO.OpenTextFile('C:\DESA\foxbin2prg\pruebas varias\Fidel\zensayo2.sc2')
>DO WHILE NOT oTS.AtEndOfStream
> cc = oTS.ReadLine()
> * do something with cc
>ENDDO
>oTS.Close()
>
>
>You can read "lines" (ending with CR/LF) or you can read fixed-length text.
>The hard part is parsing the text. Maybe Thomas's suggestion is a better approach for that.
>
>Best Regards.-
I think the biggest problem with big XML files is that your program must match each opening tag with its closing tag, which may come gigabytes later in the file. That means keeping track of every element that is still open while you read, so it's not a simple line-by-line solution.
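The bookkeeping is lighter than it sounds, though: a streaming parser only needs a stack of the currently open tags, so memory grows with nesting depth, not file size. A minimal sketch of that idea (a toy matcher over pre-tokenized tag events, not a real XML parser — the `+name`/`-name` event encoding is invented for the example):

```python
def check_balanced(tags):
    """Check that open/close tag events match, using only a stack.

    `tags` is an iterable of events such as "+release" (open tag)
    and "-release" (close tag). Memory use is O(nesting depth):
    only the tags currently open are held at any moment.
    """
    stack = []
    for t in tags:
        if t.startswith("+"):
            stack.append(t[1:])             # remember the open tag
        elif not stack or stack.pop() != t[1:]:
            return False                    # close with no matching open
    return not stack                        # leftovers mean unclosed tags
```

A Discogs dump is mostly a long flat list of shallow records, so the stack stays a handful of entries deep even at 21 GB.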