>I was just working on an application with automation functionality, and we have quite a few performance issues, especially when closing the Word server: it sometimes takes up to 10 seconds to complete the Quit command. I am therefore thinking of changing it so that the application instantiates the server on first access and keeps reusing it until the application closes, at which point we can force a task kill. What do you think of this approach? I just read that you would rather open and close the server on demand. What our application does now is process several Word documents, instantiating and closing the Word server for each document.
I've worked on a doc2pdf app for a number of years (actually just a few weeks net time), and its second version ran Word (2007 or later) to produce PDF. It's a pig: it eats memory and doesn't release it. For every document it opened and closed it would leak some 20K, so as a precaution I eventually had to kill it after 200 documents (about the lowest count it ever reached before crashing) and create a new instance.
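The "reuse the instance, but recycle it every N documents" idea can be sketched as a small wrapper. This is only an illustration: the `factory` and `dispose` callables are hypothetical stand-ins for whatever starts Word (e.g. a COM `Dispatch("Word.Application")` call) and force-kills the winword.exe process in your app.

```python
class RecyclingServer:
    """Lazily create a server on first access and recreate it after
    max_uses documents, so a leaky process never lives too long."""

    def __init__(self, factory, dispose, max_uses=200):
        self._factory = factory    # callable that starts a fresh server
        self._dispose = dispose    # callable that force-kills the server
        self._max_uses = max_uses
        self._server = None
        self._uses = 0

    def get(self):
        # Recycle: kill the old instance once it has handled max_uses docs.
        if self._server is not None and self._uses >= self._max_uses:
            self._dispose(self._server)
            self._server = None
        if self._server is None:
            self._server = self._factory()
            self._uses = 0
        self._uses += 1
        return self._server

    def close(self):
        # Final task kill when the application shuts down.
        if self._server is not None:
            self._dispose(self._server)
            self._server = None
```

Each document then calls `get()` instead of creating its own Word instance, and the application calls `close()` once on exit.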
>BTW, the slow closing happens only on some machines, and only if there is no internet connection. I could not find any specific reason for it. The only cure might be re-installing Office and/or Windows, but the same problem could of course occur at a client's site, so that would not really be a solution we could offer.
So it's calling home to report, eh? I'd use a good web sniffer to investigate what it sends. Also, how about a fake connection, i.e. something that loops any web call on those machines back to 127.0.0.1? Perhaps an entry in the hosts file would suffice.
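For the hosts-file idea, the entry would look something like the fragment below. The hostname is a placeholder: you would first need the sniffer to reveal which host Office actually tries to reach on those machines.

```
# C:\Windows\System32\drivers\etc\hosts
# Redirect the suspected call-home host to loopback
# (hostname below is hypothetical; substitute the one the sniffer shows)
127.0.0.1    telemetry.example.com
```

Note this only short-circuits DNS lookups for the named host; it won't help if the code connects to a hard-coded IP address.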