
I plan to design a database for multiple machines. Each machine has multiple sensors mounted on it that record data such as temperature, current, power, fault state, manual mode and so on, every minute. Each machine consists of multiple parts, and the same set of data is stored for each part.

A part may be divided into two subparts, and each part/subpart can have multiple temperature sensors. In reality the system consists of six parts, and only part five has two subparts. For every variable other than temperature, each part or subpart has exactly one sensor. So, for example, part1 has one value each for power, current, voltage and manual mode, but can have multiple temperature sensors.

A system can have more than 500 variables recorded every minute, and each variable has a different type of value: some are integers, some are floats with one decimal place, some with two. I can't set up a single column type for all of them. That's why I want to come up with the best structure possible, but I have never really done database design; I have only ever worked with simple tables and small amounts of data.

I just don't want to make a table with 500 or more columns since I suppose that is not a good idea.

I would do something like this:

sensor_name, type, value, timestamp

From the sensor name I know which part/subpart the reading comes from.

However, with a large machine this table will get around 500 new rows every minute.
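
A minimal sketch of that layout (assuming MySQL, with a single DECIMAL value column as a compromise; the table and column names are only illustrative):

    -- One row per sensor reading; (sensor_name, ts) identifies a reading uniquely.
    CREATE TABLE sensor_reading (
        sensor_name  VARCHAR(64)   NOT NULL,  -- encodes machine/part/subpart as described above
        reading_type VARCHAR(32)   NOT NULL,  -- e.g. 'temperature', 'current', 'power'
        reading_val  DECIMAL(10,2) NOT NULL,  -- holds integers as well as 1- or 2-decimal values
        ts           DATETIME      NOT NULL,
        PRIMARY KEY (sensor_name, ts)         -- composite key; each sensor reports once per minute
    );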

Thank you for any ideas suggested.

  • 500 rows per minute with one saving thread? That's tiny... – Akina Oct 07 '19 at 09:45
  • And what about the primary key? I've always used the timestamp as a PK, but since there will be a lot of identical timestamps, should I just use an index for the primary key? – Matej Petráš Oct 07 '19 at 10:07
  • The PK may be (sensor_name, timestamp), I think... but a synthetic PK seems preferable (to me at least) in that case. "I've always used the timestamp as a PK" sounds like "no matter what, as long as it exists". – Akina Oct 07 '19 at 10:13
  • 1
    I've always used timestamp as a PK This makes impossible to insert 2 values in one query (will fail because of duplicate value). – Akina Oct 07 '19 at 10:23
  • 1
    each variable type has different type of value. Some are integers, some are floats with one decimal some with 2 decimals. I can't setup one column for this – Matej Petráš Oct 07 '19 at 12:16
  • "I can't set up one column for this" Why? Use any compatible type: VARCHAR/VARBINARY, JSON... – Akina Oct 07 '19 at 12:25
  • Yes, I could do that. How bad would the DB be if it had 500 columns? Since I am processing the data with a separate user interface, it is easy for me to have one row with the timestamp and all the values. If the DB has one value per row, I will have to process the data again, which I think would be rather difficult. – Matej Petráš Oct 07 '19 at 14:16
  • The proposed duplicate questions are suspicious -- 7 years old; one is specific to SQL Server. – Rick James Oct 14 '19 at 23:57
  • If you get all 500 values at once, then a table with ~502 columns (sensors + timestamp + (maybe) id) is no problem, especially only once a minute. – Rick James Oct 15 '19 at 00:03
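
A sketch of the wide-table alternative from the last comment (one row per minute, one column per sensor; the column names are purely illustrative):

    -- One row per minute for the whole machine; every sensor gets its own column.
    CREATE TABLE machine_snapshot (
        ts             DATETIME      NOT NULL,
        part1_power    DECIMAL(10,2) NULL,
        part1_current  DECIMAL(10,2) NULL,
        part1_voltage  DECIMAL(10,2) NULL,
        part1_temp_1   DECIMAL(5,1)  NULL,
        part1_temp_2   DECIMAL(5,1)  NULL,
        -- ... roughly 500 further columns, one per remaining sensor ...
        PRIMARY KEY (ts)
    );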

0 Answers