+79
Pending Review

Add support for MariaDB/MySQL or another DB engine for scalability

jp10 5 years ago updated by Alex Heylin 3 months ago 25 3 duplicates

The weakest link seems to be the SQLite DB. Once the DB gets to 1GB size, the SC application becomes unstable.

Duplicates 3

PLEASE do this! The existing SQLite DB engine is a real problem. Even when DB size is low it's not great - as DB size grows it becomes a nightmare. Try running 90 day advanced auditing - every time the server is restarted SC does 30-60 mins of DB maintenance before it's usable. Please also give an option NOT to run DB maintenance on start up. Currently I'm trying to run an audit report of 30 days of basic auditing and it dies because there's too much data.

I just had my session.db corrupt because I was trying to read the session data for reporting and the DB browser died. This NEVER happens with proper DB engines. Thankfully that file only contains a month of auditing data that my boss wants a report on by the end of the day... Nothing important then! :-(
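
For anyone else pulling report data straight out of the file, it seems safer to take a copy and query the copy read-only, rather than touching the live database. A minimal Python sketch of that approach - the paths are assumptions for a default install, and you'd still need to know the table names for your actual report query:

# Sketch: snapshot Session.db and query the snapshot read-only, so reporting
# never touches the live file ScreenConnect is writing to.
# Paths are assumptions for a default install - adjust for yours, and ideally
# take the copy while the server is quiet.
import shutil
import sqlite3
from pathlib import Path

live_db = Path(r"C:\Program Files (x86)\ScreenConnect\App_Data\Session.db")
copy_db = Path(r"C:\Temp\Session-report.db")

shutil.copy2(live_db, copy_db)  # work from a snapshot, not the live file

# mode=ro opens read-only; immutable=1 tells SQLite the file won't change,
# so it skips locking entirely.
conn = sqlite3.connect(copy_db.as_uri() + "?mode=ro&immutable=1", uri=True)
try:
    # List the tables so you can see what there is to report on
    for (name,) in conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name"):
        print(name)
finally:
    conn.close()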

Pending Review

Definitely needed! Our installation is struggling with SQLite.

+1

This is a definite need - we use 5.6 right now on an MSSQL server and it handles A LOT better than SQLite ever could!

I will second that, R.peters. We use MSSQL as well now and the performance difference is quite large. SQLite was always a bottleneck as the DB grew to thousands of sessions, and it frequently gave us problems during SC upgrades. Many of our clients require us to send them scheduled reports that tell them which of our admins has accessed their machines, when, and why. MSSQL has allowed us to easily create SSRS audit reports to meet this need, and having this officially built in would be fantastic.

Agreed.

There is a forum post about doing this, but AFAICT it's not officially supported or even documented:


http://forum.screenconnect.com/yaf_postst4414_Microsoft-SQL-Server-Support.aspx

This should be a high priority. The Linux version really would be much better off with MySQL.

If this is running co-located on the LabTech server, then there already is a MySQL server to work with.


The SQLite code just can't handle the load; I can't even view my audit logs because the job times out.

Really need this functionality!

Considering for Future Release

Also, need PostgreSQL database support urgently!

Please allow for Microsoft SQL Server as a supported database platform for CW Control

+3

This is LONG overdue for a production product that many, many companies rely on for great remote connectivity to clients. Even if the first step is just the ability to use MS SQL, this would be a huge improvement. Plus, it's not as if it wouldn't work - the queries are almost identical (evidenced by the fact that other customers have already switched to it). However, as it's not officially supported, we do not want to risk any potential "out of support" issues by switching until we know that the scenario will be supported, with updates as well. MS SQL Express supports 10GB databases at this point, so even if SQL licensing is too much of a burden, this is a no-cost option. Please at least give us the option of choosing the built-in SQLite or an MS SQL server at minimum. Thanks!

The application gets slower the more unattended agents we have. Please add MS SQL support!

We also get this problem - we have on-prem unlimited licences, but the database fills up quickly over time and can't cope. Periodically we need to manually purge data out of the session.db.

Hi Rob, 

What database maintenance plan do you currently have set up? 

Thanks!

Description

Support: Delete ended sessions older than 1 day

Support: Delete Host and Guest connections older than 1 day

Support: Delete All Events events older than 1 day

Meeting: Delete ended sessions older than 1 day

Meeting: Delete Host and Guest connections older than 1 day

Meeting: Delete All Events events older than 1 day

Access: Delete ended sessions older than 1 day

Support: Delete Host and Guest connections older than 2 days

Access: Delete All Events events older than 1 day

Delete session captures older than 1 day

Compact database files

Delete session captures older than 30 days

Access: Delete All Events events older than 14 days

Support: Delete All Events events older than 2 days

Support: Delete ended sessions older than 1 day
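
If anyone ends up doing the purge-and-compact part by hand, like Rob describes above, the SQLite side of it is just an integrity check plus VACUUM. A rough sketch, assuming the default Session.db location and that the ScreenConnect service is stopped (and the file backed up) first:

# Sketch: verify and compact Session.db with plain SQLite.
# Assumptions: ScreenConnect service stopped, file backed up, default path.
import sqlite3

db_path = r"C:\Program Files (x86)\ScreenConnect\App_Data\Session.db"  # assumed default

# isolation_level=None means autocommit, so VACUUM isn't wrapped in a transaction
conn = sqlite3.connect(db_path, isolation_level=None)
try:
    # Check the file is healthy before rewriting it
    status = conn.execute("PRAGMA integrity_check;").fetchone()[0]
    print("integrity_check:", status)  # expect 'ok'

    if status == "ok":
        conn.execute("VACUUM;")  # rebuilds the file and reclaims free pages
finally:
    conn.close()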

+1

This needs to be added to the product asap. The current structure is unmanageable with more than 30,000 access sessions. Retention is already set to only a few days. When the db file gets to 2ish GB the web app completely crashes out. 

+1

I am very surprised as well that this has yet to be implemented. SQLite is just that - lite. "SQLite emphasizes economy, efficiency, reliability, independence, and simplicity", to quote directly from the product page. It is not meant to scale to large data. Large data is meant for an RDBMS (relational database management system) that can reindex, defrag, etc.
Can SQLite do everything I have mentioned? Sure - and depending on memory (RAM) constraints, it can either do it well or, as many have seen, crash hard. I would suggest that they make this change. Otherwise there are always other RATs in the world (non-viral).

+1

** BUMPITY BUMP BUMP  ** 

We're only running 1000 agents, and every time I try to run an audit report for a 4 day period it times out and crashes the browser. It's impossible to run this report; I have to run one day at a time. This NEEDS a proper DB. My vote's for MySQL, but honestly I don't care as long as it's not SQLShite, which is totally unsuited to operating this at scale. We'd like advanced auditing, but every time we've tried turning it on it makes the system unusable due to DB size.

Sadly, this product's major downfall seems to be the database. We, along with many others from what I can see, have issues with SQLite simply not being able to handle it. I personally would love to see a MySQL or PostgreSQL implementation in ScreenConnect to tighten up this product.

If ported to MySQL, the devs would need to be careful to ensure full backups are handled properly (given the lack of VSS). We recently had hell with LabTech because their DB backup process is only partial, and we didn't fully understand the implications of the lack of VSS support (the restored server had a completely useless DB) - so SC need to be careful to design something better, even if it's just dumping the WHOLE database to a file-per-table zip file at a configured time.
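
Along the lines of the "dump the whole database to a file-per-table zip at a configured time" idea, here is a rough sketch of what that could look like using stock mysqldump. The database name, credentials and paths are placeholders, and note that separate per-table dumps aren't one consistent snapshot across tables - a single whole-database mysqldump would be:

# Sketch: whole DB, one dump file per table, zipped. Names/paths are placeholders.
import subprocess
import zipfile
from datetime import datetime
from pathlib import Path

db = "screenconnect"                            # placeholder database name
auth = ["--user=backup", "--password=secret"]   # better: use a .my.cnf / login-path
out_dir = Path("/var/backups/sc") / datetime.now().strftime("%Y%m%d-%H%M%S")
out_dir.mkdir(parents=True, exist_ok=True)

# List the tables in the database
tables = subprocess.run(
    ["mysql", *auth, "--batch", "--skip-column-names", "-e", "SHOW TABLES;", db],
    check=True, capture_output=True, text=True,
).stdout.split()

# Dump each table to its own .sql file
for table in tables:
    with (out_dir / f"{table}.sql").open("w") as fh:
        subprocess.run(
            ["mysqldump", *auth, "--single-transaction", db, table],
            check=True, stdout=fh,
        )

# Bundle the per-table dumps into one zip archive
with zipfile.ZipFile(out_dir.with_suffix(".zip"), "w", zipfile.ZIP_DEFLATED) as zf:
    for sql_file in out_dir.glob("*.sql"):
        zf.write(sql_file, arcname=sql_file.name)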

Need this as well. Any blip with that file and our company is dead in the water. My file gets to 2GB almost monthly even with a maintenance plan in place.

Support for something else would be a huge help.

I still think this product needs this (it's vital for advanced auditing), but here are some steps that will help if you're struggling with DB size issues. SQLite relies entirely on the OS caching the file - there's no way to force it into RAM like you can with a proper DB engine.

1. Put the whole server on SSD, local NVMe for preference.  You should probably use RAID but make your own priority judgement here.  This should allow you to get away with less RAM, and just make the whole server fly. 

2. If you've still got issues, throw RAM at the problem. I'd estimate what you need is:

RAM required by the OS at idle (after running for a few days - just take whatever GB it's using now)
+ 1.2 x the size of your DB file
+ at least 2GB extra.

3. Consider whether you really need advanced auditing and / or to keep data for as long as you do now. Could you export more frequently and clear the DB down to keep the size down?

My rule for SC is: if Windows Resource Monitor doesn't show at least 2GB of "free" (light blue - NOT the "available") RAM with the server under its highest load, then you need to add more. Highest load may be when you're running reports etc.
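
To put the RAM estimate from step 2 into numbers, here's a tiny back-of-the-envelope calculation - the 1.2 multiplier and the 2GB headroom are just the estimates from this post, not official guidance:

# Rough RAM sizing using the rule of thumb above:
# OS idle usage + 1.2 x Session.db size + at least 2GB headroom.
def recommended_ram_gb(os_idle_gb, db_file_gb, headroom_gb=2.0):
    return os_idle_gb + 1.2 * db_file_gb + headroom_gb

# Example: OS idles at 2.5GB and Session.db is 3GB on disk
print(round(recommended_ram_gb(2.5, 3.0), 1))  # -> 8.1 GB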


Disclaimer - some of what follows is probably not technically "true", but it's the best I can recall from the last time I had to touch this, combined in a way that emphasises how I worked around the issue.

Remember you won't see a process using the RAM the DB is using - it's using file cache RAM, which isn't shown as belonging to a process, but is included in the dark blue section of the graph. If the server is short on RAM, the first thing the OS will sacrifice is the file caching you're relying on to speed up your DB access. Also check the Disk section of Resource Monitor while generating your "high load" scenario - if you see any significant read activity on the DB file then add RAM, because you want it able to serve all reads from cache. As a test, I just exported all our audit data and all you could see on the server was CPU at 80% with no file reads. It was quick, and it returned so much data that Chrome ran out of memory and killed the tab.

If that all sounds expensive: for us, 7GB RAM, 2 vCores & a 50GB SSD in a public cloud costs under GBP £20 (USD $25) per month, and this happily runs 1,500 agents as long as we don't turn on advanced auditing (which is suicidal in our experience). Shop around. If you want to use advanced auditing or keep the data for any length of time - size accordingly and hope.