10,000 page report
Message
General information
Forum:
Visual FoxPro
Category:
Reports & Report designer
Miscellaneous
Thread ID:
00751116
Message ID:
00752578
Views:
22
This message has been marked as one that helped answer the initial question of the thread.
I did a lot of work for a water utility (using FoxPro 2.6 for Mac waaaay back when) and did a lot of delinquent notice generation. Runs of 500, 10,000, or 20,000 notices were not unusual. Much like your situation, each report "record" was analogous to a full page of text with a client's usage information and so on.

In almost every (bi-monthly) run, the last n pages would not print, and because we streamed them all as one job, the whole thing would have to be rerun, and we'd do some kind of manual scramble to get them going. I even invented a reverse sort for my report cursor so that when (not if) the normal job died, at least we could run the last n in "reverse" and then manually put them on the bottom of the normal print stack.

I think a really good way to avoid all of that is to break the job up. Even if 80% are in the same major ZIP code, maybe you can break them up on ZIP+4. Or treat ZIP codes as a separate issue and, for the straight output task, just break them into n-record/page jobs. Since you already have code that runs the entire 10,000 page report, why not add a couple of lines of extra SQL to filter out just one batch at a time, and then blast them at will, with different exe's on different machines going to different printers (a sketch follows). I think you might run into problems even just streaming based on a timer - if the printer is out of paper and the report person goes to lunch, 3-6 batches will get jammed out of your single .exe, which might kill whatever server you are using for printing. Or take something like a simple paper jam: again, if things get backed up or caught on the hardware side, you will have no recourse but to run the entire thing all over again.
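
For the extra-SQL idea, a minimal sketch (untested; table, field, and report names are assumptions, and it assumes the .frx pulls from the current work area):

* Print only one slice of the bills per run
lnBatch = 3      && which slice this run/machine handles
lnSize  = 500    && records (pages) per slice
SELECT * FROM bills ;
    WHERE RECNO() BETWEEN (m.lnBatch - 1) * m.lnSize + 1 ;
                      AND m.lnBatch * m.lnSize ;
    INTO CURSOR curBatch
REPORT FORM billrpt.frx TO PRINTER NOCONSOLE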


>You have some really good ideas, but I feel that they are way too complex and/or labor intensive. I have thought about creating a form that will use a timer and spool a chunk of pages every 10 minutes or so. Anyway, I do plan on dividing it up by ZIP code, but a large chunk of the bills, probably 80%, are in the same ZIP.
>
>Eric
>
>>Assuming your reports are actually run from a report cursor and not stepping through multiple tables or doing something equally complex/obtuse: instead of running the report from the cursor, cut the cursor up into 100 tables of 100 rows, or 10 tables of 1000 rows. Split up on RECNO() or ZIP code or something.
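>>
>>A minimal sketch of that cut-up step (untested; cursor name, directory, and batch size are assumptions):
>>
>>* Split cursor curReport into 1000-row tables batch0001.dbf, batch0002.dbf, ...
>>lnSize = 1000
>>FOR lnBatch = 1 TO CEILING(RECCOUNT("curReport") / lnSize)
>>    lnStart = (lnBatch - 1) * lnSize + 1
>>    SELECT * FROM curReport ;
>>        WHERE RECNO() BETWEEN m.lnStart AND m.lnStart + m.lnSize - 1 ;
>>        INTO TABLE ("units\batch" + PADL(lnBatch, 4, "0"))
>>    USE  && close the batch table just created
>>ENDFOR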
>>
>>Take each table and a copy of the report, and create a small exe to push the report to the printer. Copy them to a directory (off the root of your main dir or something) and now you have a "report unit".
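>>
>>The runner itself could be tiny - something like this sketch (the report name is an assumption; VFP hands command-line arguments to the main program as strings):
>>
>>* runner.prg - main program of a "report unit" exe
>>LPARAMETERS tcTable
>>USE (tcTable) SHARED
>>REPORT FORM billrpt.frx TO PRINTER NOCONSOLE
>>USE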
>>
>>Assuming you have 3 systems, each with a different default physical print device: evenly distribute "report units" to each system and create a .cmd script or .wsf that will launch each report .exe sequentially.
>>
>>Now, if any single "report unit" fails, you have a discrete/managed unit that you can use to re-run its entire job until it's done, and success/failure won't affect the others.
>>
>>
>>Another thing you could do is use message queuing technology. If you're on Win2K, install MSMQ and Queue Triggers; you already have both if you have WinXP or .NET Server Beta. Decide how you will cut up the report cursor. Have the main VFP app get the first chunk into a table and stuff that table into a queue; it can then wait a reasonable period and put the other chunks into receiving queues.
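>>
>>Stuffing a chunk into a queue might look like this sketch using the MSMQ COM objects (queue path and file names are assumptions; 2 and 0 are MQ_SEND_ACCESS and MQ_DENY_NONE):
>>
>>* Send one batch table as a message body
>>loQInfo = CREATEOBJECT("MSMQ.MSMQQueueInfo")
>>loQInfo.PathName = ".\private$\reportbatches"
>>loQueue = loQInfo.Open(2, 0)
>>loMsg = CREATEOBJECT("MSMQ.MSMQMessage")
>>loMsg.Label = "batch0001"
>>loMsg.Body = FILETOSTR("units\batch0001.dbf")  && or just send the path
>>loMsg.Send(loQueue)
>>loQueue.Close()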
>>
>>Create the same mini report-running stuff as in my first idea. Create a queue trigger that launches the report runner when an item goes into the queue. The report runner pulls the table out of the queue and runs the report. If it's successful, it deletes the item from the queue. If it fails for some reason, you can restart the queue trigger service and the trigger will fire again, and the report runner will try again.
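>>
>>The runner's receive side, much simplified (1 and 0 are MQ_RECEIVE_ACCESS and MQ_DENY_NONE; note a plain Receive removes the message immediately, so the delete-on-success behavior above would really want a transactional read):
>>
>>* Pull one batch off the queue and print it
>>loQInfo = CREATEOBJECT("MSMQ.MSMQQueueInfo")
>>loQInfo.PathName = ".\private$\reportbatches"
>>loQueue = loQInfo.Open(1, 0)
>>loMsg = loQueue.Receive(0, .F., .T., 1000)  && no transaction, 1-second timeout
>>IF !ISNULL(loMsg)
>>    STRTOFILE(loMsg.Body, "work\batch.dbf")
>>    USE work\batch.dbf SHARED
>>    REPORT FORM billrpt.frx TO PRINTER NOCONSOLE
>>    USE
>>ENDIF
>>loQueue.Close()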
>>
>>The receiving queue can be on the local machine or it can be on a remote machine. One of the nice things about MSMQ is that queues can (should) be TCP-based, so you could conceivably make any machine a "report runner" with very little work, and from a single machine pass work to hosts all over the place - queues can even go over the internet (but that's another subject).
>>
>>
>>
>>One other thing that just jumped into my head was to work through IIS. You'll still want report runner exe's, but include code that uses the WinHTTP library for communication. The exe's live on the machines that will actually print.
>>
>>Cut the report into table sections and save them all to a single directory in IIS. Each report-runner exe does an HTTP GET for "reportreq.asp?action=getnext". reportreq.asp then does an Application.Lock, reads a table into a stream, changes the file extension to ".inuse", does an Application.Unlock, and Response.BinaryWrites the stream to the caller. The calling report-runner .exe saves the binary stream to local disk, so it now has a table, and it can run the report.
>>
>>When it is done, it can HTTP GET for "reportreq.asp?done=tablename". The ASP page will then know the report has run to completion, and can rename the file prefix to "complete".
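>>
>>On the runner side, the WinHTTP loop might be a sketch like this (server name and file names are assumptions; ResponseText is not binary-safe, so a real runner would pull ResponseBody through an ADODB.Stream before saving the table):
>>
>>* Fetch and print batches until the ASP page has none left
>>loHTTP = CREATEOBJECT("WinHttp.WinHttpRequest.5.1")
>>DO WHILE .T.
>>    loHTTP.Open("GET", "http://server/reports/reportreq.asp?action=getnext", .F.)
>>    loHTTP.Send()
>>    IF LEN(loHTTP.ResponseText) = 0
>>        EXIT  && nothing left to print
>>    ENDIF
>>    STRTOFILE(loHTTP.ResponseText, "work\batch.dbf")
>>    USE work\batch.dbf SHARED
>>    REPORT FORM billrpt.frx TO PRINTER NOCONSOLE
>>    USE
>>    loHTTP.Open("GET", "http://server/reports/reportreq.asp?done=batch", .F.)
>>    loHTTP.Send()
>>ENDDO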
>>
>>Actually I think doing it this way is the best. Again, you get discrete work units that you can rerun if they fail, and printing is asynchronous so you aren't killing any spooler.
>>
>>
>>
>>>I have a client that has a billing report that is about 10,000 pages long. There is one page for every record, actually. Anyway, I'm curious as to how I should divide this up. Should I let them spool 10,000 pages to the print queue? I really don't want to do this. Should I force them to divide it up? If so, how? Maybe by page number, or state, or ZIP code? Has anyone else encountered this issue? If so, how did you handle it?
>>>
>>>TIA !