8/29/2016 11:55 AM
Joined: 6/3/2014 Last visit: 3/20/2025 Posts: 660 Rating:
Hello sjm_go, if you click on your data log and then look at its properties, it tells you under "General > Data records per log" the maximum size of your log (the value depends on how you configured the log, of course). Just add them up and you should have a good estimate. If you need information about calculating the logging time or cycle time, see: https://support.industry.siemens.com/tf/ww/en/posts/129544/
~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ★ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ |
8/29/2016 1:33 PM
Joined: 8/3/2009 Last visit: 3/20/2025 Posts: 15045 Rating:
|
Hello sjm_go, I suppose you have a Comfort Panel, because of WinCC Advanced in TIA Portal and the memory card (if you had a Runtime on a PC, you would not need the memory card).

If we look at the performance features of the Comfort Panels, you can use a maximum of 50,000 entries in one log. You want to store 90 days -> 7,776,000 seconds -> 7,776,000 / 50,000 = 155.52 s -> this means the measuring point could only be stored every (roughly) 2.6 minutes to reach the 90 days. It is very hard to store such a long period on a Comfort Panel.

If you use an Advanced Runtime on a PC, you can use 500,000 entries in one log. Another way is to use this solution: Long-Term Data Archiving with WinCC Runtime PC and SIMATIC HMI Operator Panels with WinCC (TIA Portal) https://support.industry.siemens.com/cs/de/en/view/109477071

I hope this helps to understand the problem in this use case. Bye Murof
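The arithmetic above can be sketched as a small helper, using the entry limits quoted in this post (50,000 per log on a Comfort Panel, 500,000 on a WinCC Runtime Advanced PC). This is just a back-of-the-envelope check, not anything from the Siemens tooling:

```python
# Sketch: given a panel's maximum entries per data log, compute the
# minimum logging interval needed so the log covers the whole
# retention period. Entry limits are the figures quoted in the post.

def min_logging_interval_s(retention_days: float, max_entries: int) -> float:
    """Smallest interval in seconds between stored values so that
    max_entries samples span the full retention period."""
    return retention_days * 24 * 60 * 60 / max_entries

# Comfort Panel: 90 days with at most 50,000 entries per log
comfort = min_logging_interval_s(90, 50_000)    # 7,776,000 s / 50,000
print(f"Comfort Panel: one value every {comfort:.2f} s")   # ~155.5 s

# WinCC Runtime Advanced on a PC: 500,000 entries per log
pc = min_logging_interval_s(90, 500_000)
print(f"PC Runtime:    one value every {pc:.2f} s")        # ~15.6 s
```

So on the panel a tag can only be archived about every 2.6 minutes over 90 days, while the PC runtime gets you down to roughly 15 seconds.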
Last edited by: Murof at: 8/29/2016 1:35:55 PM
If this information really helps, you could use the Rate function
This contribution was helpful to 1 thankful user
9/2/2016 5:26 PM
Joined: 6/3/2014 Last visit: 3/20/2025 Posts: 660 Rating:
|
Hello sjm_go, it might work, but it's not guaranteed.

Also, the "number of entries per log" is 20,000 with a TP1200. If you want less time between the measurement points, you can use long-term archiving. This means each log covers less than 90 days, but you use several logs, which in sum are at least 90 days long.

Example: 90 d = 60 s * 60 * 24 * 90 = 7,776,000 s
7,776,000 s / 20,000 = 388.8 s = 6.48 minutes --> This is too long.

a) 129,600 / 20,000 = 6.48 --> number of logs you would need per tag if you want to log the tag every minute --> 7 logs. Each log is around 13-14 days long.

b) 1,009,870.13 / 20,000 = 50.5 --> number of logs you would need per tag if you want to log the tag every 7.7 s --> 51 logs. Each log is around 1.78 days long.

c) 900,000 / 20,000 = 45 --> number of logs you would need per tag if you want to log the tag every 8.64 s --> 45 logs. Each log is 2 days long.

You might need to save the logs on a network location, as I don't know how much memory this will take, but it feels like a lot. In case c) that would be around 4,884 KB/log * 45 = 219.78 MB for each tag over 90 days. 219.78 MB/tag * 100 tags = 21.978 GB --> so no SD cards or USB sticks in this case (and no mobile HDD either) --> save to a network path or:
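The segmented-archiving cases above can be reproduced with a short script. The 20,000-entry limit and the 4,884 KB per-log size are the figures quoted in this post (the KB value is an estimate, not a measured one), so treat the storage totals as rough:

```python
import math

# Sketch of the long-term archiving arithmetic: split 90 days of
# samples across several 20,000-entry logs (TP1200 limit) and
# estimate the storage per tag. KB_PER_LOG is the post's estimate.

RETENTION_S = 90 * 24 * 60 * 60    # 7,776,000 s
ENTRIES_PER_LOG = 20_000           # TP1200 "entries per log" limit
KB_PER_LOG = 4_884                 # size of one full log (assumed)

def plan(interval_s: float):
    samples = RETENTION_S / interval_s            # values stored in 90 days
    logs = math.ceil(samples / ENTRIES_PER_LOG)   # logs needed per tag
    days_per_log = ENTRIES_PER_LOG * interval_s / 86_400
    total_mb = logs * KB_PER_LOG / 1_000          # per tag over 90 days
    return logs, days_per_log, total_mb

for interval in (60, 7.7, 8.64):                  # cases a), b), c)
    logs, days, mb = plan(interval)
    print(f"every {interval:>5} s -> {logs:>2} logs, "
          f"~{days:.2f} days/log, ~{mb:.1f} MB/tag")
```

Case c) reproduces the numbers above: 45 logs of 2 days each, about 219.78 MB per tag, so roughly 22 GB for 100 tags.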
Last edited by: H.J. at: 9/2/2016 5:48:43 PM
~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ★ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~ ~