VFP app with SQL Server works faster after reboot
Posted: 14/03/2024 07:33:34

General information
Forum: Visual FoxPro
Category: Troubleshooting, Miscellaneous
Thread ID: 01687680
Message ID: 01687718
Views: 29
>>>>>>>The customer sent me an email indicating that the server (hosting SQL Server) has 8 GB of RAM, and that the value in the Maximum memory setting is 2147483647 MB. Does that mean they have no limitation? Or should the Maximum memory number be changed, and if so, to what?
>>>>>>
>>>>>>?2147483647/1024**2
>>>>>>gives 2048, so that limitation is set to 2048 terabytes (two petabytes). It's still a limitation, although far above what's physically possible.
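>>>>>>
>>>>>>If you want to confirm what is actually configured rather than read it off the dialog, a minimal check against sys.configurations would be something like:
>>>>>>
>>>>>>-- reads the value the engine is actually running with
>>>>>>SELECT name, value_in_use
>>>>>>FROM sys.configurations
>>>>>>WHERE name = 'max server memory (MB)';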
>>>>>
>>>>>I wonder if this very high limit effectively means no limit in practice, and therefore causes the problem where SQL Server stops after some time.
>>>>>Which brings me to a question (to which I do not know the answer): what would you set the Maximum memory value to, in order to impose a real limit and prevent SQL Server from stopping?
>>>>
>>>>Just shut down the SQL Server service and check how much available memory the system has then; use Task Manager to see that. Then set the limit to about two thirds of that. It's quite possible that the SQL engine has all the memory it may need, but the rest of the machine is starved, so the filesystem, networking etc. may suffer and thus become the bottleneck. So leave some air for the system.
>>>>That would be where I'd start.
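>>>>
>>>>For reference, here is a sketch of applying that limit with sp_configure; the 5400 below is a made-up example value standing in for "about two thirds of the free RAM you measured":
>>>>
>>>>-- 'max server memory (MB)' is an advanced option, so expose it first
>>>>EXEC sp_configure 'show advanced options', 1;
>>>>RECONFIGURE;
>>>>-- example value only; substitute your own measurement in MB
>>>>EXEC sp_configure 'max server memory (MB)', 5400;
>>>>RECONFIGURE;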
>>>
>>>Thank you for your input and for your suggestions. So far I have told the customer to restart SQL Server so that they can go ahead with the application (it was stuck and unusable). As for memory, I told them that most likely it is the Max memory number set in SQL Server that causes the problem. But I did not want to suggest what number to use (even though I believe you know what you are talking about). The customer is a big organization and they should have a DBA. I told them to check with the DBA and set the limit to whatever number he/she suggests. I will see what they do and how it affects the operation.
>>>Again, thank you.
>>
>>If a Windows computer (server or workstation) slows down to the point of unusability, it's almost always because the disk subsystem is overloaded. Some major causes of this:
>>- Unknown or unexpected heavy disk I/O such as backups running during business hours, other apps running on the server etc.
>>- Insufficient free RAM, forcing memory to be paged to disk (a server-side check of this follows below)
>>- Low free disk space (under 5%) causing file fragmentation, especially with mechanical HDDs
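>>
>>As a cross-check from inside SQL Server, the engine's own view of OS memory can be queried directly; a sketch, assuming SQL Server 2008 or later where this DMV exists:
>>
>>-- how much physical memory the OS reports, and whether Windows
>>-- is signalling memory pressure
>>SELECT total_physical_memory_kb / 1024 AS total_mb,
>>       available_physical_memory_kb / 1024 AS available_mb,
>>       system_memory_state_desc
>>FROM sys.dm_os_sys_memory;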
>>
>>I would do these checks, in this order, while the server is functional:
>>1. Check free space on server disks using File Explorer
>>2. Use Task Manager to check RAM usage. You can also check whether anything is pinning the CPUs, but on modern systems that's unlikely unless it's a VM underprovisioned with vCPUs
>>3. On the Task Manager Performance tab click on "Open Resource Monitor" at the bottom. When RM is open, go to the Disk tab. Scroll to the bottom and expand the "Storage" section if needed, then for the logical disks review the Disk Queue Length and Active Time % columns:
>>- on a system with a strong enough disk subsystem, Disk Queue Length should almost always be between 0 and 1, and Active Time under 10%.
>>- a heavily loaded but acceptable system will have 100% active time and may have DQL hovering around 3 - 5. Anything higher is overloaded
>>- I have seen grossly overloaded systems with DQL consistently over 15 and peaking over 200
>>- if high DQL is found, look at the Disk Activity section in RM (e.g. Total B/sec) or go to the Processes tab in Task Manager and sort them highest to lowest by Disk utilization to find the culprit(s)
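>>
>>If SQL Server itself turns out to be one of those culprits, its per-file I/O statistics point at the busiest database files; a sketch using a DMV that has been around since SQL Server 2005:
>>
>>-- which database files have accumulated the most I/O and stall time
>>SELECT DB_NAME(database_id) AS db,
>>       file_id,
>>       num_of_reads + num_of_writes AS io_count,
>>       io_stall AS total_stall_ms
>>FROM sys.dm_io_virtual_file_stats(NULL, NULL)
>>ORDER BY io_stall DESC;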
>>
>>Since its early days SQL Server has been marketed as "self-tuning" and not requiring a DBA for many common use cases. Yes, by default it will use as much RAM as it can get hold of, but modern versions are pretty good about returning RAM to Windows if the OS is busy servicing other apps or doing other things.
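>>
>>You don't have to take that on faith: the engine reports both what it currently holds and what it is aiming for. A sketch, using column names from SQL Server 2012 or later:
>>
>>-- physical memory the SQL Server process is holding right now
>>SELECT physical_memory_in_use_kb / 1024 AS in_use_mb,
>>       memory_utilization_percentage
>>FROM sys.dm_os_process_memory;
>>
>>-- committed memory vs. the target the engine is working toward
>>SELECT committed_kb / 1024 AS committed_mb,
>>       committed_target_kb / 1024 AS target_mb
>>FROM sys.dm_os_sys_info;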
>>
>>A while back I was working with an HP server, 8GB RAM, 2x 1TB mechanical HDD in RAID1. It had a SQL Server 2012 Standard instance and also ran a couple of RDS sessions with a VFP fat client app. I don't recall seeing that server go over 7 GB RAM in use and it never needed any SQL tuning. Client app performance was good even over a slow DSL remote connection.
>>
>>More recently I set up a Hyper-V host with 4x 4TB mechanical HDDs in RAID10 on a high-end caching RAID controller. The largest VM is provisioned with 48GB RAM and 10 vCPUs. It has an instance of SQL Server 2019 and supports a parametric CAD app with ~10GB of databases. Performance is excellent and I have never seen that VM report over 10GB total RAM usage (no custom tuning). The client is happy to let that VM use the rest as disk cache, and if SQL Server decides it temporarily wants more for some inscrutable purposes, it's available for that, too.
>>
>>So, a SQL Server Standard instance running on a server with 8 GB RAM can be OK as long as it's not overwhelmed by lots of other apps or services. If it's slow, do some basic research and find the bottleneck(s).
>
>Thank you very much for the detailed message. I will read it carefully tomorrow (too tired right now to focus), and maybe I will send some of the items you suggest to the customer's IT staff.

Al is on the mark as usual - but consider time and cost.

Actually I'd ask the customer if the machine accepts ANY further RAM. Quadrupling the physical memory should cost less than the time spent searching for different software settings - and if the machine is limited to 8 GB of physical RAM, the CPU, RAM and board are past their prime.

Searching for the culprit after having ample RAM is usually easier as well.

regards
thomas