{"id":255000,"date":"2024-10-19T16:51:13","date_gmt":"2024-10-19T16:51:13","guid":{"rendered":"https:\/\/pdfstandards.shop\/product\/uncategorized\/bs-en-61850-102013\/"},"modified":"2024-10-25T12:17:28","modified_gmt":"2024-10-25T12:17:28","slug":"bs-en-61850-102013","status":"publish","type":"product","link":"https:\/\/pdfstandards.shop\/product\/publishers\/bsi\/bs-en-61850-102013\/","title":{"rendered":"BS EN 61850-10:2013"},"content":{"rendered":"
IEC 61850-10:2012 specifies standard techniques for conformance testing of client, server and sampled value devices and engineering tools, as well as specific measurement techniques to be applied when declaring performance parameters. These techniques enhance the system integrator's ability to integrate IEDs easily, operate them correctly, and support the applications as intended. The major technical changes with respect to the previous edition are: updates to the server device conformance test procedures; and the addition of certain test procedures (client device conformance, sampled values device conformance, (engineering) tool related conformance, GOOSE performance).<\/p>\n
| PDF Pages | PDF Title |
|---|---|
| 7 | English CONTENTS |
| 11 | INTRODUCTION |
| 12 | 1 Scope; 2 Normative references |
| 13 | 3 Terms and definitions |
| 15 | 4 Abbreviated terms |
| 16 | 5 Introduction to conformance testing; 5.1 General |
| 17 | 5.2 Conformance test procedures; 5.3 Quality assurance and testing; 5.3.1 General |
| 18 | 5.3.2 Quality plan |
| 19 | 5.4 Testing; 5.4.1 General |
| 20 | 5.4.2 Use of SCL files; 5.4.3 Device testing; Figures; Figure 1 – Conceptual conformance assessment process |
| 21 | 5.5 Documentation of conformance test report |
| 22 | 6 Device related conformance testing; 6.1 Test methodology; 6.2 Conformance test procedures; 6.2.1 General; 6.2.2 Test procedure requirements |
| 23 | 6.2.3 Test structure; 6.2.4 Test cases to test a server device; Figure 2 – Test procedure format |
| 24 | Figure 3 – Test system architecture to test a server device; Tables; Table 1 – Server documentation test cases |
| 25 | Table 2 – Server configuration test cases; Table 3 – Server data model test cases |
| 26 | Table 4 – Association positive test cases |
| 27 | Table 5 – Association negative test cases; Table 6 – Server positive test cases |
| 28 | Table 7 – Server negative test cases |
| 29 | Table 8 – Data set positive test cases |
| 30 | Table 9 – Data set negative test cases |
| 31 | Table 10 – Service tracking test cases; Table 11 – Substitution positive test cases |
| 32 | Table 12 – Setting group positive test cases; Table 13 – Setting group negative test cases |
| 33 | Table 14 – Unbuffered reporting positive test cases |
| 34 | Table 15 – Unbuffered reporting negative test cases |
| 35 | Table 16 – Buffered reporting positive test cases |
| 37 | Table 17 – Buffered reporting negative test cases |
| 38 | Table 18 – Log positive test cases; Table 19 – Log negative test cases |
| 39 | Table 20 – GOOSE publish positive test cases |
| 40 | Table 21 – GOOSE subscribe positive test cases; Table 22 – GOOSE management positive test cases; Table 23 – GOOSE publish negative test cases |
| 41 | Table 24 – GOOSE subscribe negative test cases; Table 25 – GOOSE management negative test cases; Table 26 – Control test cases |
| 43 | Table 27 – SBOes test cases |
| 44 | Table 28 – DOns test cases; Table 29 – SBOns test cases |
| 45 | Table 30 – DOes test cases; Table 31 – Time positive test cases |
| 46 | Table 32 – Time negative test cases; Table 33 – File transfer positive test cases; Table 34 – File transfer negative test cases |
| 47 | 6.2.5 Test cases to test a client device; Figure 4 – Test system architecture to test a client device; Table 35 – Network redundancy test cases |
| 48 | Table 36 – Client documentation test cases; Table 37 – Client configuration test cases; Table 38 – Client data model test cases |
| 49 | Table 39 – Association positive test cases |
| 50 | Table 40 – Association negative test cases; Table 41 – Server positive test cases |
| 51 | Table 42 – Server negative test cases; Table 43 – Data set positive test cases |
| 52 | Table 44 – Data set negative test cases |
| 53 | Table 45 – Service tracking test cases; Table 46 – Substitution test cases |
| 54 | Table 47 – Setting group positive test cases; Table 48 – Setting group negative test cases |
| 55 | Table 49 – Unbuffered reporting positive test cases |
| 56 | Table 50 – Unbuffered reporting negative test cases; Table 51 – Buffered reporting positive test cases |
| 58 | Table 52 – Buffered reporting negative test cases; Table 53 – Log positive test cases |
| 59 | Table 54 – Log negative test cases; Table 55 – GOOSE control block test cases; Table 56 – Control general test cases |
| 60 | Table 57 – SBOes test cases; Table 58 – DOns test cases |
| 61 | Table 59 – SBOns test cases; Table 60 – DOes test cases |
| 62 | Table 61 – Time positive test cases; Table 62 – Time negative test cases; Table 63 – File transfer positive test cases; Table 64 – File transfer negative test cases |
| 63 | 6.2.6 Test cases to test sampled values device; Figure 5 – Test system architecture to test a sampled values publishing device |
| 64 | Figure 6 – Test system architecture to test a sampled values subscribing device; Table 65 – Sampled values documentation test cases |
| 65 | Table 66 – Sampled values configuration test cases; Table 67 – Sampled values data model test cases |
| 66 | Table 68 – Sampled value control block test cases |
| 67 | Table 69 – Send SV message publish test cases; Table 70 – Send SV message subscribe positive test cases |
| 68 | 6.2.7 Acceptance criteria; 7 Tool related conformance testing; 7.1 General guidelines; 7.1.1 Test methodology; Table 71 – Send SV message subscribe negative test cases |
| 69 | 7.1.2 Test system architecture; 7.2 Conformance test procedures; 7.2.1 General; 7.2.2 Test procedure requirements; 7.2.3 Test structure; 7.2.4 Test cases to test an IED configurator tool; Figure 7 – Test system architecture to test a configurator tool |
| 70 | Table 72 – ICD test cases; Table 73 – ICD export test cases; Table 74 – SCD import test cases |
| 71 | 7.2.5 Test cases to test a system configurator tool; Table 75 – IED configurator data model test cases; Table 76 – IID export test cases; Table 77 – Negative IID export test case; Table 78 – System configurator documentation test case |
| 72 | Table 79 – ICD \/ IID import test cases; Table 80 – ICD \/ IID negative test case |
| 73 | Table 81 – Communication engineering test cases; Table 82 – Communication engineering negative test case; Table 83 – Data flow test cases; Table 84 – Data flow negative test cases |
| 74 | Table 85 – Substation section handling test cases; Table 86 – SCD modification test cases |
| 75 | Table 87 – SCD export test cases; Table 88 – SCD import test cases |
| 76 | 7.2.6 Acceptance criteria; 8 Performance tests; 8.1 General; Table 89 – SED file handling test cases |
| 77 | 8.2 Communications latency; 8.2.1 Application domain; 8.2.2 Methodology |
| 78 | 8.2.3 GOOSE performance test; Figure 8 – Performance testing (black box principle) |
| 79 | Figure 9 – Measure round trip time using GOOSE ping-pong method |
| 81 | Table 90 – GOOSE performance test cases |
| 82 | 8.3 Time synchronisation and accuracy; 8.3.1 Application domain; 8.3.2 Methodology |
| 83 | 8.3.3 Testing criteria; Figure 10 – Time synchronisation and accuracy test setup |
| 84 | 8.3.4 Performance; 9 Additional tests |
| 85 | Annex A (informative) Examples of test procedure template |
| 86 | Bibliography |
","protected":false},"excerpt":{"rendered":"Communication networks and systems for power utility automation – Conformance testing<\/p>\n