Nice to hear of your success!
What seemed to be the fly in the ointment for an SQL approach was the fact that you had 3 tables for which outer joins would be required. One or two are rather straightforward to implement. But, not having done 3, I wasn't sure what would be involved, and didn't have the time to experiment.
Sometimes good old Fox DML really comes through %).
Jay
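
For the record, the syntax I'd have started from is just a chain of joins; a sketch with hypothetical tables t1..t4 sharing a key column (untested, which was exactly the problem):

select t1.key, t2.val, t3.val, t4.val ;
   from t1 ;
   left outer join t2 on t1.key = t2.key ;
   left outer join t3 on t1.key = t3.key ;
   left outer join t4 on t1.key = t4.key ;
   into cursor crResult

The joins themselves aren't the hard part -- it's the NULL handling on the unmatched rows that I expected to get fiddly.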
>Just to let you know what happened.
>
>After many tries on different things I gave up and found that procedural code works great here.
>
>I ran several queries to get the data, plus a few cursors (using DISTINCT) for the available dates within the data, all possible countries, and all possible codes. Then I ran 3 nested SCAN loops:
>
>SCAN               && e.g. the dates cursor
>   SCAN            && SELECT the countries cursor first
>      SCAN         && SELECT the codes cursor first
>         INSERT INTO ...
>      ENDSCAN
>   ENDSCAN
>ENDSCAN
>
>This is extremely fast anyway, as the number of records is not that big (the query starts on a table of over 500,000 records and ends up with <1000 hits).
>
>Now I have a denormalized table. Then I run a query with GROUP BY and SUM(Trans)
>to avoid any duplicates from the SCAN loops, and the data is ready.
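>
>A sketch of that de-dup query (the column names Country, Code, dDate and Trans are placeholders, and crDenorm stands for the denormalized table):
>
>SELECT Country, Code, dDate, SUM(Trans) AS Trans ;
>   FROM crDenorm ;
>   GROUP BY Country, Code, dDate ;
>   INTO CURSOR crFinal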
>
>With the help of Tamar and Della's book, this now produces an Excel Pivot Table,
>zipped and emailed automatically with DynaZIP and wwIPStuff. The users then receive a live Pivot Table that they can manipulate and re-use at will. They love it!
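>
>For the Excel side, the gist is OLE Automation (the file name and data range below are made up; see the book for the real details):
>
>oExcel = CREATEOBJECT("Excel.Application")
>oBook = oExcel.Workbooks.Open("c:\temp\data.xls")
>* Build a Pivot Table from the data range (1 = xlDatabase source)
>oBook.Sheets(1).PivotTableWizard(1, "Sheet1!R1C1:R1000C4")
>oBook.Save()
>oExcel.Quit()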
>
>>
>>>>>>>>>>>>>>>>>
>>>You may want to try something like this:
>>>
>>>select table1.key, table2.key ;
>>>from table1
>>> left outer join table2 on table1.key=table2.key ;
>>>having isnull(table2.key)
>>>
>>>
>>>Watch out: I think the fact that HAVING works in an SQL statement without GROUP BY is a bug in VFP. A statement like this will fail in SQL Server or Oracle.
>>>>>>>>>>>>>>>>>>>
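
PS: The portable way to write that unmatched-rows test is a WHERE clause on the outer join. This works in VFP, and written as "table2.key IS NULL" it works in SQL Server and Oracle too:

select table1.key ;
   from table1 ;
   left outer join table2 on table1.key = table2.key ;
   where isnull(table2.key)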