Data logging to flat file (CSV) and SD card longevity

I'm following the https://www.wago.com/global/lp-thank-you-onlineseminar-data-logging video and creating a data logger on my PFC300 controller to write to CSV files on an SD card. I am giving the user a settable option to write a data point every 1 to 60 seconds for process data recording. For this I am using the WagoAppFileDir.FbGeneralFile_XXXX_mod function blocks. At any given point in time, up to 3 separate CSV files are being written to: an alarm log, an audit log, and a process data log. The process data log will see the most extensive writes (in the worst case, one entry every second while the system is in run mode), followed by the audit log, which essentially records an entry for every button press on the visualization. Finally, the alarm log will see the fewest, with an entry made every time the status of an alarm changes.

What are the best practices when writing to an SD card? Should I close the file after every write and reopen it only when the next entry is to be made, or can I leave the file open and just keep appending until the data recording session is over?

Obviously, at some point the SD card might become corrupted due to the many write cycles. Is there a rule of thumb for how long an industrial SD card might last, i.e. how many read/write cycles it can withstand? The web comes up with figures of up to 100,000 write cycles for industrial-grade cards and talks about wear leveling. If I am writing to the card every second, that translates to less than 2 days of continuous storage! The SD card is not the primary data storage method here and is not recommended for data retention, but I was hoping to provide it as an easy way for the user to store data as a local backup.

Are there any suggestions on how to prolong the life of the SD card in such an application? Is my reasoning about the lifetime of the SD card correct? I was thinking about storing data in a large buffer array locally and transferring it to the SD card every 5 minutes, say, but I am not sure whether that might have adverse effects on the controller memory. Also, it is not clear to me what constitutes a write cycle: if I pass the array into the File_Write_mod function, is each array entry considered a write? If so, the buffer method doesn't help me either.

Any guidance would be greatly appreciated.

First, I would suggest using only industrial-grade SD cards like WAGO's.
They have up to 20,000 write cycles per cell and up to 10 years of data retention.
The maximum number of writes is a per-cell figure, not a figure for the whole SD card, of course.
The lifetime will depend on many factors, such as the free space available (the more there is, the more the writes are spread across different cells), the temperature, and of course the number of writes.
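A back-of-envelope example of why the per-cell figure does not translate into only days of lifetime (illustrative numbers, not a datasheet calculation; the card size and erase-block size here are assumptions):

```text
8 GB card with 128 KB erase blocks      → ≈ 60,000 blocks
20,000 cycles/block × 60,000 blocks     → ≈ 1.2 × 10^9 block writes in total
Logging 1 line per second               → ≈ 86,400 writes/day
Even with 10× write amplification (≈ 864,000 block writes/day),
that is ≈ 1,400 days, i.e. years rather than the ~2 days you get
if you assume every write hits the same cell.
```

The key assumption is that wear leveling can actually spread the writes, which is why keeping plenty of free space on the card matters.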
I would also recommend using a buffer that you can place in RETAIN memory; then the data won't be lost if a power outage occurs. This will reduce the number of writes a lot.
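A minimal sketch of that buffering pattern in Structured Text (the variable names and the flush trigger are illustrative, and the actual WagoAppFileDir write call is only hinted at in a comment, since its full interface is not shown here):

```iecst
VAR RETAIN
    asBuffer : ARRAY [0..99] OF STRING(255); (* survives a power outage *)
    iCount   : INT := 0;                     (* number of buffered lines *)
END_VAR
VAR
    xNewSample : BOOL;        (* TRUE for one cycle when a line is ready *)
    sLine      : STRING(255); (* the CSV line built this interval *)
END_VAR

(* Each logging interval: append to the RETAIN buffer, not to the card *)
IF xNewSample AND iCount <= 99 THEN
    asBuffer[iCount] := sLine;
    iCount := iCount + 1;
END_IF

(* Only touch the SD card when the buffer is full (or on a timer) *)
IF iCount > 99 THEN
    (* open the file, write all iCount lines in one operation via the
       WagoAppFileDir function block, then close the file *)
    iCount := 0;
END_IF
```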

As an additional hint, consider taking a look at the status of your SD card (if it supports S.M.A.R.T).

Thanks @quenorha for the explanation. Makes sense with the industrial grade SD cards.

How would the buffer look in practice? I am assuming that every call to the FbGeneralFile_Write_mod function constitutes a write. Each line in my process data file is close to the 255-character limit for a string. If I create a buffer array in RETAIN memory containing a new entry for each data record line, I would still need to iterate through it and pass each of those elements to the FbGeneralFile_Write_mod function to be written. Or can I pass the entire array, and is that considered a single write? This is where I don't completely understand what constitutes a write. If my assumption is correct, wouldn't passing all the elements of that array be just as bad as writing to the SD card every second? If I can pass the entire string array in one call of the FbGeneralFile_Write_mod function and that counts as a single write, then I see how passing the data points to a buffer makes sense.

What I have done in the past, and it seems rather easy to implement, is to create an array of your data types; in this case you would have 3 different UDTs. Then make an array of each of those data types of, let's say, 100 items. When the array gets full, simply copy the "live" array to the "write" array, reset the pointer into your "live" array, and then start a write process to the SD card. This limits the number of opens and closes you do on the file, and is actually a little faster.
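The live/write array mechanism described above could look roughly like this in Structured Text (typProcessData is a hypothetical UDT standing in for your own data types, and the trigger flags are illustrative):

```iecst
VAR
    aLive       : ARRAY [0..99] OF typProcessData; (* filled every sample *)
    aWrite      : ARRAY [0..99] OF typProcessData; (* handed to the file task *)
    iIndex      : INT := 0;
    xNewSample  : BOOL;           (* TRUE for one cycle when data is ready *)
    xStartWrite : BOOL;
    stSample    : typProcessData; (* the current data record *)
END_VAR

IF xNewSample THEN
    aLive[iIndex] := stSample;
    iIndex := iIndex + 1;
    IF iIndex > 99 THEN
        aWrite := aLive;      (* copy the full live array in one assignment *)
        iIndex := 0;          (* reset the pointer into the live array *)
        xStartWrite := TRUE;  (* kick off the SD-card write state machine *)
    END_IF
END_IF
```

Copying into a separate write array means the logger can keep filling aLive while the (comparatively slow) file operation works through aWrite.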

Totally agree about the mechanism.

Thanks all for the insights. I chatted with support to get some help on how best to implement writing the array to a CSV file, as I ran into issues with gibberish (ASCII characters) written at the end of each string in the CSV. This was because the code was writing each complete fixed-size string, including trailing null characters that cannot be recognized and cut off when the array is passed by byte pointer and byte length via FbGeneralFile_Write_mod. A different function block like FbGeneralFile would need to be used to pass the elements of the string array by iterating through each element; the original function block can probably be used too if you iterate through the elements.
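One way to avoid the trailing null characters while still doing a single write call is to pack only the real characters of each string into one contiguous byte block first (a sketch; SysMemCpy here stands for the CODESYS SysMem memory-copy function, and you should check its exact signature on your target):

```iecst
VAR
    asLines : ARRAY [0..99] OF STRING(255);
    abBlock : ARRAY [0..25599] OF BYTE; (* 100 lines x 256 bytes, worst case *)
    i       : INT;
    iLen    : INT;
    udiPos  : UDINT := 0;
END_VAR

udiPos := 0;
FOR i := 0 TO 99 DO
    iLen := LEN(asLines[i]); (* actual length, excludes the 0 terminator *)
    SysMemCpy(ADR(abBlock[udiPos]), ADR(asLines[i]), INT_TO_UDINT(iLen));
    udiPos := udiPos + INT_TO_UDINT(iLen);
END_FOR
(* then pass ADR(abBlock) and udiPos as the byte length in one write call *)
```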

Someone who knows what is happening in the background with these functions should feel free to correct me if I am wrong, but in the end I don't understand how the use of a buffer is any different from writing to the SD card directly without a buffer. If I write a string every second to the SD card, how is that different from writing 100 strings to the SD card every 100 seconds? It is the same number of writes, in my opinion.
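The payload bytes are indeed the same, but the number of physical flash writes is not, because flash is written in whole pages and every file append also updates filesystem metadata (the FAT and the directory entry). Rough, illustrative numbers, assuming ≈512-byte pages and ≈250-byte lines:

```text
1 line per second, file flushed each time:
  ≈ 1 data-page write + 1 FAT update + 1 directory update per append
  → ≈ 3 page writes/s  → ≈ 260,000 page writes/day

100 lines every 100 seconds, one flush:
  ≈ 50 sequential data-page writes + 1 FAT + 1 directory update per flush
  → ≈ 0.5 page writes/s → ≈ 45,000 page writes/day
```

On top of the several-times-fewer page writes, one large sequential write is much easier for the card's wear leveling to handle than many small scattered ones.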

In discussions with support I learned that the WAGO data logger works the same way: it cyclically writes a new line at the desired interval, and this has been used in systems for many years without issues. Of course, it depends on the frequency at which one is writing to the card. If I am logging an entry every second, I will damage the card at some point. It is best to increase the time between writes as much as possible to extend the life of the card.