As part of a hobby project I have a number of servers on my network that measure various parameters (both system statistics and home-automation/domotica readings) and store the results (mostly as 5-minute averages) in various tables in a MySQL database running on one of the servers.
Using bash, each server periodically queries the MySQL server and writes a subset of the stored data to a local file. I then use python and gnuplot to turn the raw data into trend graphs.
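For context, each fetch is roughly like the sketch below; the host, database, table and column names are just illustrative placeholders, not my actual schema:

```bash
#!/usr/bin/env bash
# Minimal sketch of the per-server fetch (host, database, table and
# column names are placeholders, not the real schema; DB_PASS is
# expected to be set in the environment).

DB_HOST="dbserver.local"
DB_USER="grapher"
DB_NAME="telemetry"
OUTFILE="/var/tmp/sensor_week.tsv"

# Pull the last 7 days of 5-minute averages as tab-separated values
# that gnuplot can read directly.
mysql -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" -D "$DB_NAME" -N -B -e "
  SELECT sample_time, avg_value
  FROM sensor_5min
  WHERE sample_time >= NOW() - INTERVAL 7 DAY
  ORDER BY sample_time;" > "$OUTFILE"
```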
I currently make hourly, daily and weekly graphs. I already have a lot of data and would like to make a couple of yearly graphs as well. However, queries for a week's worth of data already take some time (query + network + local disk writes), so I'm hesitant to proceed.
I was wondering if you have experience with this type of problem and know of standard solutions that I have not found.
Are there magic queries, compression techniques or local storage solutions that I might want to investigate? The goal is to reduce the network load and the MySQL server workload by preventing it from having to serve the same data over and over again.
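To make the "same data over and over" point concrete, something like the following incremental fetch is the kind of approach I imagine; this is only a rough sketch with invented table, column and path names:

```bash
#!/usr/bin/env bash
# Rough sketch of the "don't serve the same data twice" idea: keep a
# local cache file and only request rows newer than the newest cached
# timestamp. All names and paths are invented for the example.

DB_HOST="dbserver.local"
DB_USER="grapher"
DB_NAME="telemetry"
CACHE="/var/tmp/sensor_cache.tsv"

# Newest timestamp already cached; fall back to a week ago if the
# cache is empty (GNU date syntax).
LAST=$(tail -n 1 "$CACHE" 2>/dev/null | cut -f1)
[ -z "$LAST" ] && LAST=$(date -d '7 days ago' '+%Y-%m-%d %H:%M:%S')

# Append only the new rows; the local cache grows, and the server
# never re-sends data it has already delivered.
mysql -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" -D "$DB_NAME" -N -B -e "
  SELECT sample_time, avg_value
  FROM sensor_5min
  WHERE sample_time > '$LAST'
  ORDER BY sample_time;" >> "$CACHE"
```

The yearly graphs would then be built from the local cache rather than from a fresh full-range query each time.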
memcached sounds interesting. I'll put that on my to-check list. – Mausy5043 Sep 14 '16 at 15:56