My scenario is likely common: many product test stations sending test data to a centralized (master) database. (In my case the product being tested is a high-frequency transceiver.)
My goal: Keep the test stations running if the network connection goes down and they cannot reach the database. In other words, I need a way to save test data locally and then write it to the database once the test station server (the database client) can connect to the database server again. Once the data has been written to the central DB server, I want to delete it from the test station server.
What I'm looking for: An overall approach and the terminology I need to know so that I can find what I need to read and understand. I don't need a lot of details beyond that, but feel free if you like! :)
What I know already: SQL, SQL Server, Database Design, SSIS, SQL Server Agent.
What I've found so far: Suggestions on this site and elsewhere to use SSIS and replication. (I doubt I'll be using SQL Server; I'll most likely use an open-source DB.) (Edit: from what J.D. has said, I will likely try to talk my employer into paying for SQL Server once I understand how its features let me do what I've outlined in this question.)
What I would do if left to my own devices:
- Create a centralized database on a server.
- Create local databases on the clients (the test stations) with the same schema as the DB server. (I'd need a way to propagate schema changes from the DB server to all of the clients.)
- Have my hardware test application write to the local DB.
- Have a process running somewhere that transfers data to the central DB, deleting the local data as it goes.
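To make the steps above concrete, here is a minimal store-and-forward sketch. It uses two SQLite connections purely to stand in for the local buffer and the central database; the table, column names, and functions are all made up for illustration, and a real setup would point the "central" connection at whatever server DB you choose. The key idea is the ordering: commit to the central DB first, and delete the local copies only after that commit succeeds.

```python
import sqlite3

# Hypothetical schema shared by the local buffer and the central DB.
SCHEMA = """
CREATE TABLE IF NOT EXISTS test_results (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    serial_number TEXT NOT NULL,
    measurement REAL NOT NULL,
    tested_at TEXT NOT NULL
);
"""

def record_result(local, serial_number, measurement, tested_at):
    """The test application always writes to the local DB first,
    so the station never blocks on the network."""
    with local:  # commits on success
        local.execute(
            "INSERT INTO test_results (serial_number, measurement, tested_at) "
            "VALUES (?, ?, ?)",
            (serial_number, measurement, tested_at),
        )

def flush_to_central(local, central):
    """Run periodically (e.g. from a scheduled job). Copies buffered rows
    to the central DB; deletes local copies only after the central commit
    succeeds, so a network failure never loses data."""
    rows = local.execute(
        "SELECT id, serial_number, measurement, tested_at "
        "FROM test_results ORDER BY id"
    ).fetchall()
    if not rows:
        return 0
    try:
        with central:  # one transaction; rolls back if the insert fails
            central.executemany(
                "INSERT INTO test_results (serial_number, measurement, tested_at) "
                "VALUES (?, ?, ?)",
                [(r[1], r[2], r[3]) for r in rows],
            )
    except sqlite3.Error:
        return 0  # central DB unreachable: keep local rows, retry later
    with local:
        local.executemany("DELETE FROM test_results WHERE id = ?",
                          [(r[0],) for r in rows])
    return len(rows)
```

This pattern (sometimes called a local queue or outbox) is essentially what replication automates for you, but it shows the moving parts you'd otherwise hand-roll.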
I know I can write code to do this, but I suspect that I'd be reinventing the wheel, and my wheel would likely be sub-optimal. I'll read more on replication to see if that solves my problem.
Thanks for any help!